Analysis of Web-based Information Architecture in a University Library: Navigating for Known Items

David Robins and Sigrid Kelsey

This paper presents a descriptive study of the Louisiana State University Libraries' Web site. The intent of the study was to gain some idea of user demographics and satisfaction with the site at a given point in time and to test the site's navigation system. We wished to find out who was using the site, why they were using it, and to what extent they were satisfied with the site's navigation. We then assigned tasks (searching for known items) to subjects to better determine the extent to which the site's navigation system facilitated locating information on the site. Evaluation of the navigation system was based on a ratio of correct clicks to the sum of incorrect and back-button clicks; this ratio may be compared with some predetermined optimal number of clicks needed to retrieve a known item. The implications of this research are both theoretical and practical: these models of in-house, Web-based information seeking may be used by similar institutions that seek to provide useful Web sites for their users, and they provide a basis for further research on the development of Web-based information retrieval systems.


In fall 1999, the Louisiana State University (LSU) Libraries Web site contained more than six thousand files ending with the extensions html, htm, or txt. The site provides access to databases, full-text journal articles and books, information about library staff and hours, forms to reserve library classrooms, and more. Usage during the slowest time of day in 1999 exceeded usage during the peak time in 1995, so the Web site is clearly being used more and more frequently for many purposes. In order to facilitate successful research through the libraries' Web site, it is necessary to analyze researchers' behavior when searching the site.

The LSU Libraries Web site was redesigned at the start of the spring 1999 semester. Prior to the redesign, the LSU Libraries Webgroup conducted library-wide forums open to all library employees. The Webgroup asked the forum participants to fill out a brief survey, discussing what they thought the major problems of the Web site were. Furthermore, the Webgroup presented various library Web sites to the participants, facilitating a discussion on the methods and styles of other library Web sites, and taking notes on the comments. Finally, the Webgroup presented the participants with several prototypes of a new Web site (first layer only) and solicited feedback.



Based on the feedback from the forums, five new prototypes for Web pages were designed, and the Webgroup asked the library employees for more feedback. The Webgroup research resulted in a new design for the LSU Libraries Web page, using a directory structure with seven headings: LOLA--online catalog, Electronic Resources, Key Links, General Information, Services, Library Department and Campus Libraries, and Search the Libraries' Web Pages.

The major problem specified at the forums was navigation: people did not always know where to click to find what they were looking for. The new design attempted to solve this problem in several ways: the new directory structure with related links under each heading made navigation more intuitive; a Search this Site feature, formerly available deeper within the site, was moved forward to the front layer; and an A to Z: Web Site Contents was added under the General Information heading to alphabetically list Web pages and topics likely to be searched for by Web site users.

Shortly after the new design was implemented, a link to the Web survey was added to the home page to survey users about the Web site design, initiating our research.

Problem

The ideal library Web site leads its patrons to whatever information they are seeking in a straightforward and efficient manner. The organization, wording, and content of the Web site are important in leading its users to desired information. Web pages can be organized in a number of ways: a directory structure is one of the most common ways to organize a site. Site maps, tables of contents, and search engines can help organize a Web site and facilitate easy navigation. The Web site developer must decide which of these tools to use, and how best to use them. In addition to the structure of the Web site, the developer must take into consideration the vocabulary on the site, such as library jargon. The following questions must be considered: What is library jargon? Are such terms as "indexes" and "databases" jargon? Some libraries have switched to using phrases such as "look for journal articles here." The library Web developer must decide whether eliminating such terms as "indexes" dumbs down the language of the libraries' site or enhances usability. This paper outlines a protocol library Web site designers can follow to ensure that their sites present the information they contain in a manner easily accessible to library patrons.

Related Literature

The practice of creating Web-based information resources for libraries, or any other organization, is a new one. The Web has only been in existence for about twelve years, and the graphical browsers making it accessible to a broad base of users have only been available for about seven years. In addition, it has only been within the last four or five years that Web site development tools have made it possible for developers without sophisticated technological training and experience to create Web sites. The art of Web site development is in its infancy.

User-behavior studies became prevalent in the early nineties, usually regarding CD-ROMs. Many early studies survey how successful users were in searching various databases; some of these discuss redesigning interfaces based on the observed user behavior. Many of these studies are listed and abstracted in "User's Information-Seeking Behavior: What Are They Really Doing? A Bibliography." 1

Some early studies include the one conducted by Puttapithakporn in 1990, which surveys twenty-three students in an information science class searching ERIC on CD-ROM. 2 The article focuses on the problems the students encountered. Bucknall and Mangrum study the CD-ROM service at the University of North Carolina at Chapel Hill, concluding that librarians need to be prepared to instruct first-time users as well as respond to complex search questions, and that most users prefer staff assistance to other forms of help. 3 Culbertson's study uses Total Recall, software created by Computer Foundations, to capture the keystrokes of CD-ROM users. 4 Culbertson notes that users rarely use advanced searching options, and recommends user training.

In the mid-nineties, user studies regarding systems created within the library begin to appear in the literature. The conclusions mention redesigning interfaces rather than focusing on better user training. Catledge and Pitkow present an early study of user behavior with recommendations for Web interfaces in their paper "Characterizing Browsing Strategies in the World Wide Web." 5 The authors use a log analysis to recommend interface features, such as placing must-see information within two to three jumps of the initial home page, and using indexes throughout the site.

In a 1998 article, Carter examines the Indiana University-Purdue University, Indianapolis (IUPUI) library Web site interface, initially surveying staff about the site, and following up with a task questionnaire similar to the one we sent out to students. 6 Carter asked library staff familiar with the interface to complete the questionnaire; we asked LIS 1001 students, many of whom had never used the Web site, to complete ours. Carter counted the number of clicks for each task, also similar to our study. Finally, Carter sent out a survey to the library staff comparable to the survey we posted on our Web site. She concludes with some basic tips, such as having a link back to the main menu, and creating a broad menu structure.

Veldof, Prasse, and Mills discuss two types of usability evaluation methods, and recommend a combination of the two: studies with real users, and ones without real users, such as applying a set of heuristics to the design. 7 In our studies, we have combined several types of both methods. Jakob Nielsen's usability engineering principles and the three-click rule are examples of heuristics. 8

While some principles of design are available for librarians to follow as they attempt to provide Web-based information resources for their users, and there are some empirical studies that provide a solid basis for design, the area of research is a new one. Furthermore, academic libraries provide various levels of complexity that a Web site must address. For example, these sites must provide links to bibliographic databases; the library's Online Public Access Catalog (OPAC); general information about the library, such as hours, services, and departments; and internally created resources, such as Web pathfinders for specific subject areas.

General Web design guidelines, or heuristics, offer useful tips and information to begin evaluating a Web site. Nielsen's Top Ten Mistakes in Web Design, for example, provides some clear, simple guidelines every Web site should strive to implement, such as avoiding complex URLs and nonstandard link colors. 9

Rosenfeld and Morville identify a wide variety of issues related to what can be called information architecture (IA). 10 IA is a term that has come to represent efforts to develop best practices for the design of Web-based information sources. Navigation systems, taxonomies, and other organization systems, search systems, labeling systems, and overall site design are all subsets of IA. Our focus here is on navigation, but as Rosenfeld and Morville point out, navigation systems are inextricably tied to such subsystems as labeling and organization. Robins presents another explanation of IA as a tool for records management. 11

Navigation in large Web sites presents challenges for Web site designers. Some of the issues related to navigation include:

labeling: for example, whether one should use precise, technical language, or labels more appropriate for the nontechnical user;

hierarchy and context of navigation systems: which pieces of the navigation system should appear on every page, and which pieces should appear only on some pages; and

breadth and depth of menu systems.

Dietrich, Gordon, and Wexler examine the issue of breadth and depth of Web-based menus and find that users make fewer errors when menu systems are broad rather than deep. 12 Similarly, Morkes and Nielsen find that most Web site users (79 percent in their study) do not read content word by word, but rather scan pages. 13 Presumably, the users are looking for something specific that they suspect might be on the page or something less well-defined (i.e., they may be operating under the assumption that they will know what they are looking for when they see it). In any case, a broad menu system allows a user to easily scan its contents for the path to desired information by allowing a user to rely on recognition of the desired link as opposed to attempting to recall or decipher a more general heading.

Based on our findings of similar studies and related writings, we decided to test the effectiveness of LSU Libraries' navigation system design.


Research Design

In order to study the effectiveness of our Web site's navigation system, we chose to:

conduct a Web survey to determine the current attitudes toward site usability; and

develop and implement a series of navigation tasks for undergraduate students enrolled in basic library research courses.

The task's design is based largely on principles put forth by Nielsen. 14

Step 1: Usability Survey

In order to obtain a general notion of users' attitudes about the current site's usability, we developed an instrument to elicit information from users about their experience using the library system's Web site. The instrument we used was based on questions found on a standard software usability instrument, the Software Usability Measurement Inventory (SUMI). 15 SUMI is designed to evaluate productivity applications such as word-processors and spreadsheets.

Because we were studying the usability of a Web site, with which users interact somewhat differently than they do with productivity software, we made certain modifications to the instrument. For example, SUMI asks subjects to respond "agree," "undecided," or "disagree" to the following statement: "It takes too long to learn the software commands." In this case we decided not to use the statement in the survey because there are no commands to learn on the Web site. Similarly, we substituted "Web site" for "software" to make the statements more appropriate for our purposes. Finally, we chose not to use SUMI's three-tiered response system. Using the instrument, the respondents rated their level of agreement on a Likert scale of 1 to 5 (1 = strongly agree, 5 = strongly disagree). We chose to have subjects scale their responses so that we could get a more general picture of their attitudes toward the site's usability. We hoped to gain from the administration of the instrument a better understanding of navigation and discovery, design, and overall feelings.

Items in the navigation and discovery section sought to elicit attitudes about the respondents' ability to find what they were looking for. Furthermore, we wanted to find out if users found unexpected, serendipitous information resources by using the site. Finally, some of the items in this section sought to find out if the terminology used on the site was a problem. The second category of survey items dealt with the site's design: how fast the site loaded; the consistency of layout, graphics, and headings; and colors. We wanted to know if the Web site looked and felt like other Web sites with which people were familiar. The third category attempted to obtain users' overall attitudes and feelings about using the Web site. Table 1 shows the survey instrument with items listed according to purpose.

Table 1. Questionnaire Sorted by Question Type


Type | #  | Item
D    | 1  | I like the menu system on this Web site
D    | 2  | This Web site is slow
D    | 3  | The color combinations on this Web site should be changed
D    | 4  | This Web site "behaves" like most other Web sites
D    | 5  | The menus on this Web site are confusing
D    | 6  | There is lots of help available on this Web site
D    | 7  | This Web site was designed with users in mind
D    | 8  | The Web site responded rapidly during navigation
D    | 9  | The Web site has a consistent "look and feel" throughout
D    | 10 | I like the colors on this Web site
N    | 11 | The terminology in the menus was familiar to me
N    | 12 | I always knew where to find what I was looking for
N    | 13 | I found ONLY the information I was looking for and nothing else
N    | 14 | I had to go through too many menus to find what I was looking for
N    | 15 | The search mechanism on the Web site was helpful
N    | 16 | It takes too long to find something on this Web site
N    | 17 | I sometimes wonder whether I'm clicking on the right menu
N    | 18 | The Web site provides the opportunity to discover information
N    | 19 | The menus on this Web site are logically constructed
N    | 20 | Navigation of this site was problematic
N    | 21 | The categories on the main page provided a guide to the information I needed
N    | 22 | I unexpectedly found useful information on this Web site
N    | 23 | It was easy to find specific information on this Web site
O    | 24 | I needed help finding what I needed on this Web site
O    | 25 | Using this Web site was fun
O    | 26 | I felt frustrated using the Web site
O    | 27 | It took too long to find what I wanted on this site
O    | 28 | I felt lost on this Web site
O    | 29 | Going to the library makes me uncomfortable
O    | 30 | I enjoyed using this Web site
O    | 31 | I would recommend this Web site to my colleagues
O    | 32 | I felt tense using this Web site
O    | 33 | There was too much jargon in the menus
O    | 34 | Using the library, in general, can be frustrating
O    | 35 | I'll definitely use this site in the future

D = Design, N = Navigation/discovery, O = Overall feeling

Step 2: Navigation/Usability Tasks

In order to observe how people use the Web site, and to determine navigational problems associated with the site, we set up a number of tasks for users to perform. The tasks were all searches for information known to exist on the Web site. For this study, 314 undergraduate students in various sections of LIS 1001 (Library Research Methods and Materials) were asked to perform searches. Each student was assigned two tasks, and students who completed the assignment received extra credit in the course. Because the assignments were distributed during the first week of class (before they were taught about the Web site in class), the students completing the tasks represented a cross section of undergraduates from a variety of departments. The variance in the distribution of each task is due to how the assignments were handed out: in some classes, they were not shuffled before being handed out, so more students received the same assignment. Table 2 shows the breakdown of tasks, the number of students involved, and the minimum number of navigational moves necessary to complete each task. Each subject was asked to document each navigational move they made during their attempt to complete their assigned tasks. A navigational move was defined as a click on a hyperlink.

Table 2. Navigation Tasks Assigned to Subjects


Task | Description                                                          | Potential subjects | Minimum moves
1    | How does one reserve one of the library's electronic classrooms?     | 20                 | 2
2    | Who is the Dean of Special Collections?                              | 21                 | 2
3    | What is Sigrid Kelsey's e-mail address?                              | 36                 | 1
4    | Find what Y2K resources are at LSU Library                           | 20                 | 1
5    | Find a page containing databases related to art                      | 20                 | 1
6    | Find a link to the Medline database                                  | 37                 | 1
7    | Find a link to Project Muse                                          | 43                 | 2
8    | Find the phone number to call to renew a book                        | 43                 | 1
9    | Find the call number for Vanity Fair by William Makepeace Thackeray  | 37                 | 1
10   | What time does the Design Library close on weekdays?                 | 37                 | 2

Total potential subjects: 314

Note: The difference in the number of subjects assigned to each task is due to the fact that subjects were drawn from several sections of a class (LIS 1001, Library Research Methods and Materials), and the sections varied considerably in enrollment. Subjects are "potential" subjects because participation was not compulsory; extra credit in the course was the only inducement offered for participation.

Analysis of these tasks followed two conventions. First, the starting point of each task was the LSU Libraries Web site; therefore, any navigation necessary to get to the LSU Libraries Web site was not counted as a move necessary to complete a task. For example, if a subject documented that they went to the main LSU Web site first, used three clicks to get to the LSU Libraries Web site, and then proceeded to complete the assigned task, the first three navigational moves were not counted. Second, finding the link to the desired information constituted completion of the task. That is, clicking on the link to the page containing the desired information was not counted in the total navigational moves necessary to complete a task.
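The two conventions amount to trimming a recorded click trail before it is scored. A minimal sketch in Python (the URL constant, function name, and list-of-URLs representation are our illustrative assumptions, not part of the study):

```python
LIB_HOME = "https://www.lib.lsu.edu/"  # hypothetical site prefix

def scoreable_moves(clicks, target):
    """Apply the study's two counting conventions to a click trail.

    clicks: ordered list of URLs the subject visited.
    Moves made before first reaching the libraries' site are dropped
    (convention 1), and the final click onto the target page itself is
    not counted as a move (convention 2).
    """
    # Convention 1: start counting from the libraries' home page.
    for start, url in enumerate(clicks):
        if url.startswith(LIB_HOME):
            break
    else:
        return []  # subject never reached the site
    trail = clicks[start + 1:]  # moves made from the site onward
    # Convention 2: the click onto the target page is not counted.
    if trail and trail[-1] == target:
        trail = trail[:-1]
    return trail

# Example: one click from the main LSU site, then two in-site moves.
trail = scoreable_moves(
    ["https://www.lsu.edu/",
     "https://www.lib.lsu.edu/",
     "https://www.lib.lsu.edu/services",
     "https://www.lib.lsu.edu/services/renew"],
    target="https://www.lib.lsu.edu/services/renew")
print(len(trail))  # 1 counted move
```

Only the single in-site move is scored; the detour through the main campus site and the final click onto the target page are both discarded.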

Another consideration in analyzing navigational moves was the minimum number of moves necessary to complete a task. In a Web site of any complexity, there may not be one right way to retrieve a known information item. In fact, on most of the assigned tasks, we identified more than one way to navigate to the desired information. Therefore, it is more appropriate to talk about the optimum or minimum number of moves. In some cases, it was possible to navigate various paths from the libraries' home page to the desired page and still complete the task in the minimum number of moves.

Finally, in analyzing subjects' navigation of the site, we used the coding scheme shown in table 3. Each move documented by subjects was coded accordingly, and totals for each code were counted. This phase of the study was designed to address research questions 4 and 5, dealing with navigational processes and whether users were successful.

Table 3. Coding Scheme Used to Analyze Tasks Assigned to Subjects


Abbreviation | Code name              | Description
I            | Incorrect              | Any navigational move that will not directly lead to the specified known item
B            | Back one page, or Home | Clicking any "back" or "home" navigational device
C            | Correct                | Any navigational move leading directly toward the specified known item
O            | Offsite move           | Using a source outside of the LSU Libraries Web site to find the desired information
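These codes feed the evaluation metric described in the abstract: a ratio of correct clicks to the sum of incorrect and back-button clicks. A minimal sketch in Python (the function name and the handling of error-free trails are our assumptions):

```python
def navigation_ratio(moves):
    """Compute correct / (incorrect + back) from a sequence of move codes.

    moves: string of codes from the study's scheme:
        'C' correct, 'I' incorrect, 'B' back/home, 'O' offsite.
    Returns None when there are no error moves to divide by.
    """
    correct = moves.count("C")
    errors = moves.count("I") + moves.count("B")
    if errors == 0:
        return None  # error-free navigation; the ratio is undefined
    return correct / errors

# A subject who clicked one wrong link, backed up, then found the item:
print(navigation_ratio("IBC"))  # 1 correct / (1 incorrect + 1 back) = 0.5
```

A higher ratio indicates more efficient navigation; the result can then be compared against the minimum number of moves listed in table 2.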

Results

The results of this study are presented in the order in which they address the research questions, reflecting the stages specified in the research design. The usability survey in step one addressed the research questions "Who is using the LSU Libraries' Web site?", "For what purposes do users consult the LSU Libraries' Web site?", and "What impressions about the site do its users have?" The task analysis in step two addressed the research questions "In what ways do users navigate the site?" and "How successful are users when attempting to find desired information on the LSU Libraries Web site?"

Step 1: Survey Results

First, we will show what types of users use the LSU Libraries Web site, for what purposes they use it, and what impressions they have of it. In total, we received 129 responses to the survey, but many had to be discarded because the respondents did not answer a majority of the items. We determined that if a respondent left only two or three items blank, we would tally his or her responses. Therefore, the totals in table 4 (indicated under "n") are not always the same, but the range among those totals is only three (mean = 64, median = 64, mode = 63, standard error = 0.4).

Table 4. Survey Results Presented in Raw Numbers and As a Percentage for Each Item


 

 

 

Type | Item | Question                                                                     | Rating 1-5 (count)* | n  | Mean | Rating 1-5 (%)*
D    | 2    | I like the menu system on this Web site                                      | 14 18 16 9 12       | 69 | 2.81 | 20 26 23 13 17
D    | 3    | This Web site is slow                                                        | 16 12 15 20 6       | 69 | 2.83 | 23 17 22 29 9
D    | 4    | The color combinations on this Web site should be changed                    | 10 4 16 18 21       | 69 | 3.52 | 14 6 23 26 30
D    | 5    | This Web site "behaves" like most other Web sites                            | 11 19 24 5 9        | 68 | 2.74 | 16 28 35 7 13
D    | 12   | The menus on this Web site are confusing                                     | 14 10 18 15 7       | 64 | 2.86 | 22 16 28 23 11
D    | 13   | There is lots of help available on this Web site                             | 8 7 28 11 10        | 64 | 3.13 | 13 11 44 17 16
D    | 20   | This Web site was designed with users in mind                                | 10 21 16 6 11       | 64 | 2.80 | 16 33 25 9 17
D    | 23   | The Web site responded rapidly during navigation                             | 9 20 19 9 7         | 64 | 2.72 | 14 31 30 14 11
D    | 26   | The Web site has a consistent "look and feel" throughout                     | 12 21 19 5 5        | 62 | 2.48 | 19 34 31 8 8
D    | 35   | I like the colors on this Web site                                           | 14 16 15 7 9        | 61 | 2.69 | 23 26 25 11 15
N    | 1    | The terminology in the menus was familiar to me                              | 15 23 16 4 11       | 69 | 2.61 | 22 33 23 6 16
N    | 6    | I always knew where to find what I was looking for                           | 7 13 12 17 18       | 67 | 3.34 | 10 19 18 25 27
N    | 7    | I found ONLY the information I was looking for and nothing else              | 4 6 25 9 22         | 66 | 3.59 | 6 9 38 14 33
N    | 9    | I had to go through too many menus to find what I was looking for            | 20 16 12 9 10       | 67 | 2.60 | 30 24 18 13 15
N    | 11   | The search mechanism on the Web site was helpful                             | 10 18 18 7 11       | 64 | 2.86 | 16 28 28 11 17
N    | 16   | It takes too long to find something on this Web site                         | 15 13 12 14 9       | 63 | 2.78 | 24 21 19 22 14
N    | 17   | I sometimes wonder whether I'm clicking on the right menu                    | 20 12 13 10 9       | 64 | 2.63 | 31 19 20 16 14
N    | 19   | The Web site provides the opportunity to discover information                | 18 19 14 3 9        | 63 | 2.46 | 29 30 22 5 14
N    | 22   | The menus on this Web site are logically constructed                         | 6 26 17 8 6         | 63 | 2.71 | 10 41 27 13 10
N    | 25   | Navigation of this site was problematic                                      | 10 15 16 13 8       | 62 | 2.90 | 16 24 26 21 13
N    | 31   | The categories on the main page provided a guide to the information I needed | 10 22 11 10 9       | 62 | 2.77 | 16 35 18 16 15
N    | 32   | I unexpectedly found useful information on this Web site                     | 9 15 23 8 8         | 63 | 2.86 | 14 24 37 13 13
N    | 34   | It was easy to find specific information on this Web site                    | 7 18 18 10 9        | 62 | 2.94 | 11 29 29 16 15
O    | 8    | I needed help finding what I needed on this Web site                         | 14 15 25 12 0       | 66 | 2.53 | 21 23 38 18 0
O    | 10   | Using this Web site was fun                                                  | 8 7 23 8 20         | 66 | 3.38 | 12 11 35 12 30
O    | 14   | I felt frustrated using the Web site                                         | 18 10 8 14 14       | 64 | 2.94 | 28 16 13 22 22
O    | 15   | It took too long to find what I wanted on this site                          | 18 11 12 12 10      | 63 | 2.76 | 29 17 19 19 16
O    | 18   | I felt lost on this Web site                                                 | 10 15 12 13 13      | 63 | 3.06 | 16 24 19 21 21
O    | 21   | Going to the library makes me uncomfortable                                  | 9 6 14 5 29         | 63 | 3.62 | 14 10 22 8 46
O    | 24   | I enjoyed using this Web site                                                | 8 17 19 5 14        | 63 | 3.00 | 13 27 30 8 22
O    | 27   | I would recommend this Web site to my colleagues                             | 13 16 18 8 8        | 63 | 2.71 | 21 25 29 13 13
O    | 28   | I felt tense using this Web site                                             | 9 6 18 15 16        | 64 | 3.36 | 14 9 28 23 25
O    | 29   | There was too much jargon in the menus                                       | 10 4 19 17 12       | 62 | 3.27 | 16 6 31 27 19
O    | 30   | Using the library, in general, can be frustrating                            | 18 9 15 13 8        | 63 | 2.75 | 29 14 24 21 13
O    | 33   | I'll definitely use this site in the future                                  | 26 17 9 1 10        | 63 | 2.24 | 41 27 14 2 16

* Items were rated from 1 (strongly agree) to 5 (strongly disagree).

Type: D = design-related question, N = navigation/discovery question, and O = overall feelings.

Item: the numbers in the Item column give the order in which the items appeared on the survey as presented to respondents.
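The n and mean columns in table 4 follow the tallying rule described above: respondents who left more than a few items blank were dropped, and each item's statistics ignore any remaining blanks. A sketch of that computation in Python (the data layout and the exact blank threshold are illustrative assumptions):

```python
def item_stats(responses, max_blanks=3):
    """Per-item (n, mean) from Likert responses coded 1-5, None = blank.

    responses: list of lists, one inner list per respondent with one
    slot per survey item. Respondents with more than max_blanks blank
    items are discarded, mirroring the study's tallying rule; each
    item's n and mean are then computed over its non-blank answers.
    """
    kept = [r for r in responses if r.count(None) <= max_blanks]
    n_items = len(kept[0])
    stats = []
    for i in range(n_items):
        vals = [r[i] for r in kept if r[i] is not None]
        stats.append((len(vals), sum(vals) / len(vals)))
    return stats
```

Because each item drops only its own blanks, the per-item n values differ slightly, which is why the totals under "n" in table 4 are not all the same.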

The respondents represent a self-selected group because we solicited responses in the form of a request posted on the Web site itself. In the next section, we describe the respondents' demographics.

Who is using the LSU Libraries' Web site, and how are they using it?
Before discussing the results of our survey, we must state the following caveats. First, the respondents were not selected at random; they simply responded to a request on the Web site to fill out the survey, and were thus self-selected. Second, because respondents were not under any sort of control or obligation to the researchers, there is no way to ensure that their responses were honest or accurate. Third, the respondents heavily represent the undergraduate population, so any deviation from the mean is, in essence, a deviation from the undergraduate mean. In any case, our goal with the survey was to get a general feeling for how the users of this Web site respond to using it. It is, therefore, in the spirit of gaining baseline data that we proceeded with the survey.

In a three-month period, seventy-four respondents filled out the survey. Of the seventy-four respondents, thirty-two (43 percent) were undergraduate students, fifteen (20 percent) were not affiliated with LSU, fourteen (19 percent) were graduate students, six (8 percent) were faculty, and five (7 percent) were staff. Two (3 percent) did not report their status. There were not enough responses in each respondent class to determine whether differences among the classes were meaningful. We did observe some tendencies, which should, of course, be taken as possible future areas of research and not as conclusive results. For example, faculty were more likely than students to disagree with the statement, "The Web site provides the opportunity to discover information." Faculty agreed at a higher rate than students that the site was jargon laden. However, since there were only six responses from faculty, these results are tentative at best.

Nevertheless, the purposes for which the site was used were somewhat varied (see the summary in Table 5). Of all respondents who indicated their primary purpose for visiting the site (55 in all), 24 were just browsing, 24 were working on a specific assignment, and 5 were working on a dissertation or thesis. Most of the undergraduates (28 indicated purpose) used the site for assignments (17) and the rest used it simply for browsing (11). The 14 graduate students who indicated use were almost evenly divided among just browsing (4), assignments (5), and dissertation/thesis (5). Those not affiliated with LSU and indicating use (6) were all browsing. Only one faculty member and two staff members reported browsing and one staff member reported working on an assignment.

Table 5. Purposes for Which Users Visited the LSU Libraries Site


Purpose      | Undergrad | Grad | Prof. | Staff | Not Affiliated | Blank | Total
Assignments  | 17        | 5    | --    | 1     | --             | 1     | 24
Browsing     | 11        | 4    | 1     | 2     | 6              | --    | 24
Dissertation | 0         | 5    | --    | --    | --             | --    | 5
Research     | 0         | --   | 1     | --    | --             | --    | 1
Work         | 0         | --   | --    | 1     | --             | --    | 1
Total        | 28        | 14   | 2     | 4     | 6              | 1     | 55

What impressions about the site do its users have?
The results varied greatly, from comments such as this one, from an English professor:

I consider the LSU Library Home Page "impossible." It is highly inefficient. There are too mahy [sic] fine distinctions of primarily administrative interest. Especially perplexing is the huge list of possible interfaces that one receives when logging on from outside the library or campus. It might be acceptable if some of these interfaces worked, but, alas, they are all either impossibly slow or crash the host system (or crash it in virtue of being slow and timing it out). I am an experienced computer user and have used electronic library catalogues all over the world and have never seen such a miasma. Please simplify the system and arrange it intuitively rather than on the basis of the arrangements you have with software and database vendors. I have been working with this system now for 45 minutes trying to get a single citation and have now given up in order to TELNET to other remote libraries using my own independent browser.

There were also more positive comments, such as the following, from a graduate student in library and information science:

I feel that this is a fine attempt towards the ideal of a totally intutative [sic] and very user-friendly Web site. I am very, very pleased at the efforts laid out in this cyber-document. Please continue to maintain it.

Survey respondents rated their level of agreement with statements about the site's navigability and usability, its design, and their overall feelings about using it. Table 4 summarizes the results of these rankings.

The Web survey gave us a basic idea of how people searched the Web site and what they did and did not like. Because Web sites are dynamic, the site did change over the period in which the survey was offered; however, no drastic changes to its look and feel or organization were implemented during that time. While the survey gave us some general ideas about user attitudes toward our Web site, it did not provide specific information except where respondents added comments. Because of the Likert scale, the survey recorded general feelings but not the reasons behind the answers. Moreover, many users filled out some answers when it was clear that their main objective was to request a book or complain about the facilities. Nevertheless, the information from the survey gave us direction in formulating questions for the second phase of the study; for example, 48 percent of the respondents agreed (chose 1 or 2) that it took too long to find what they were looking for on the Web site.
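Agreement figures of this kind reduce to a simple proportion: the share of respondents choosing 1 or 2 on an item. A minimal sketch (the function name is ours, and the ratings list is made-up illustrative data, not survey results):

```python
def pct_agree(ratings):
    """Percentage of respondents who agreed (rated 1 or 2) on an item."""
    agree = sum(1 for r in ratings if r in (1, 2))
    return 100 * agree / len(ratings)

# Hypothetical ratings from eight respondents on one item:
print(pct_agree([1, 2, 2, 3, 4, 5, 1, 3]))  # 50.0
```

Applied item by item to the counts in table 4, this gives the agreement percentages discussed throughout the results.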

Step 2: Task Analysis

We wanted to see how the Web site's users navigated in order to find information they wanted. To do so, we constructed a set of tasks for users to accomplish. Users were instructed to find specific, known items in the site. In all, there were ten tasks given to various sets of users, with each user receiving two tasks. The variance in the number of users completing each task is due in part to uneven distribution: in some of the classes, the assignments were not properly shuffled before being handed out. Nevertheless, the percentages in the results give a clear idea of which tasks were completed with the greatest success. Because the menu system on this site is broad rather than deep, it was possible to assign tasks that could be accomplished in one or two moves; therefore, it was easy to identify when items were found in an optimal way. There were times when subjects found alternate routes to the known items using the optimal number of moves, but these cases were rare. Ordinarily, a user needed to take one particular path to a known item in order to get there in the minimal number of moves. A move is defined as a subject's use of a hyperlink to go from one point in the Web site to another.

Subjects recorded each move, correct or incorrect, and the researchers coded the moves using a simple scheme: C = correct move; I = incorrect move; B = clicking to go back a screen; and O = off-site browsing. An example of off-site browsing was a subject using the LSU directory, or a 411 service, to find an e-mail address instead of using the LSU Libraries' Web site to do so. Such subjects found the information, and while that is not strictly incorrect, we were interested in the navigation system of the libraries' site, not in the subjects' ability to find information on the Web at large.

These categories were derived by analyzing samples of moves, and were not decided upon a priori by the researchers. This scheme allowed us to:

compare patterns of moves among different tasks;

compare patterns of moves within tasks; and

compare optimal moves with observed moves.
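The coding scheme can be sketched as a simple tally over each subject's recorded sequence of moves. The move strings below are hypothetical examples for illustration, not observed data from the study:

```python
from collections import Counter

def tally_moves(log: str) -> Counter:
    """Count how many moves of each type appear in one subject's log.

    Codes follow the paper's scheme: C = correct, I = incorrect,
    B = back/home, O = off-site.
    """
    return Counter(log)

# Hypothetical move logs for two subjects on the same two-move task.
move_logs = {
    "subject_1": "CC",       # optimal two-move path
    "subject_2": "IBCIBCC",  # two false starts before succeeding
}

for subject, log in move_logs.items():
    counts = tally_moves(log)
    print(subject, dict(counts), "total moves:", len(log))
```

Summing such tallies per task yields the within-task and across-task comparisons described above.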

In what ways do users navigate the site?
We began the task analysis by looking at various comparisons of moves within and among tasks. In all, we observed 772 moves by all subjects across all tasks. Total moves within each task and for each category within tasks are shown in table 6. In part, these data address the question, "What percentage of move types was observed for each task?" Here, we see that the task to find a link to Project Muse appears to have been particularly troublesome: forty-three subjects made 159 moves in completing the task, an average of 3.7 moves to accomplish what could have been done in two. Interestingly, the same subjects performed the second most efficient task (1.81 moves per subject) on the telephone renewal question, which suggests that the difficulty lay with the Project Muse task itself. Perhaps the students did not know what Project Muse was and, consequently, did not know where to begin looking.

Table 6. Moves by Category and Task, with Percentages of Moves within Task Categories

Task                                Subj.   Incorrect   Back/home    Correct    Off-site     Total
                                            No.    %    No.    %    No.    %    No.    %    No.    %
Reserving electronic classroom        14     11   23      7   15     29   62      0    0     47   100
Dean of Special Collections           21     11   16      9   13     50   71      0    0     70   100
Sigrid Kelsey's e-mail                36      6   10      5    8     27   44     23   38     61   100
Y2K resources at LSU                  20     11   30      8   22     17   46      1    3     37   100
Find art databases                    20     21   30     11   15     39   55      0    0     71   100
Find link to Medline                  37     16   17     10   11     64   67      5    5     95   100
Link to Project Muse                  43     28   17     17   11     81   50     33   20    159   100
Find number for phone renewal         43     10   13      3    4     65   83      0    0     78   100
Call number to Vanity Fair (book)     37      6    8      8   11     58   77      3    4     75   100
Time design library closes            37      2    3      1    1     72   91      4    5     79   100
Total                                308    122   16     79   10    502   65     69    9    772   100

In addition, table 6 shows that 65 percent of all moves were correct moves. This probably speaks well for the Web site in general. However, the percentage of correct moves ranged from 91 percent (for the design library closure time task) down to 44 percent (for the e-mail task). It is unclear what this large range suggests, but it is probably due, at least in part, to the fact that efficient navigation relies on language to guide users to needed information. Language is notoriously ambiguous, and users will have problems making navigation choices; therefore, much thought should go into labeling design.16

One point of interest in the data shown in the following tables is that two of the tasks, Sigrid Kelsey's e-mail and Project Muse, instigated a high percentage of off-site searches by users. We will see that Project Muse was particularly troublesome for subjects in other ways, but the e-mail address task was not otherwise difficult. Subjects may have been familiar with other services from which e-mail addresses could be found, and immediately sought out those resources. An additional point of confusion may have been that the users were not aware that Sigrid Kelsey is a libraries' employee, even though she introduced herself while distributing the assignments.

Another question we might ask about how users navigate the site is, "Of all moves observed across all tasks, what percentage occurred in each category?" This analysis enables us to see which tasks show unusually high occurrences in each category. Table 7 addresses this question. The Project Muse, art databases, and Medline tasks accounted for 53 percent of all incorrect moves (23 percent, 17 percent, and 13 percent, respectively); all other tasks accounted for percentages ranging from 2 percent to 11 percent. However, since more than twice as many subjects were assigned to the Project Muse task as to the art databases task (forty-three and twenty, respectively), the art databases task appears to have led subjects astray more often on a per-subject basis.

Table 7. Categories and Tasks Expressed as a Percentage of Moves within Categories across Tasks

Task                                Subj.   Incorrect   Back/home    Correct    Off-site    Total
                                            No.    %    No.    %    No.    %    No.    %
Reserving electronic classroom        14     11    9      7    9     29    6      0    0      47
Dean of Special Collections           21     11    9      9   11     50   10      0    0      70
Sigrid Kelsey's e-mail                36      6    5      5    6     27    5     23   33      61
Y2K resources at LSU                  20     11    9      8   10     17    3      1    1      37
Find art databases                    20     21   17     11   14     39    8      0    0      71
Find link to Medline                  37     16   13     10   13     64   13      5    7      95
Link to Project Muse                  43     28   23     17   22     81   16     33   48     159
Find number for phone renewal         43     10    8      3    4     65   13      0    0      78
Call number to Vanity Fair (book)     37      6    5      8   10     58   12      3    4      75
Time design library closes            37      2    2      1    1     72   14      4    6      79
Total                                308    122  100     79  100    502  100     69  100     772

The percentages of correct moves were relatively evenly distributed among the tasks, with the exception of the reserving electronic classroom, Sigrid Kelsey's e-mail, Y2K resources, and art databases tasks. These four tasks accounted for only 22 percent of all correct moves, and in each of them the percentage of correct moves was equal to or below the percentage of incorrect moves. These percentages are another indicator that subjects may have had an unusual amount of difficulty with these tasks, and may point to areas in which the Web site needs redesign.

How successfully do users navigate the site?
Obviously, it is difficult to compare the ten tasks as equals. Two factors complicate the issue. One is that the tasks themselves are different. The other is that the same groups did not do all ten tasks--each participant had only two tasks. That being said, we can make some comparisons by calculating the average moves per task made by each subject. This analysis will show whether there are vast differences among tasks regarding each task's difficulty. It will also provide a measure of the efficiency with which subjects accomplished each task. In table 8, we show data that rank the efficiency with which subjects accomplished each task, measured as a ratio of correct moves to the sum of incorrect and back moves (C/(I+B)) in each task. That is, we combined back moves with incorrect moves under the assumption that back moves were the result of some mistake, real or perceived, that inclined the subject to start over.
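The efficiency ratio can be computed directly from per-task move counts such as those in table 6. The sketch below uses four of the tasks' counts and reproduces the corresponding ratios reported in table 8 (for example, 72 / (2 + 1) = 24.00 for the design library task):

```python
# Move counts (C = correct, I = incorrect, B = back) taken from table 6.
tasks = {
    "Time design library closes":      {"C": 72, "I": 2,  "B": 1},
    "Find number for phone renewal":   {"C": 65, "I": 10, "B": 3},
    "Link to Project Muse":            {"C": 81, "I": 28, "B": 17},
    "Find art databases":              {"C": 39, "I": 21, "B": 11},
}

def efficiency_ratio(counts: dict) -> float:
    """Correct moves divided by the sum of incorrect and back moves."""
    return counts["C"] / (counts["I"] + counts["B"])

# Rank tasks from most to least efficient, as in table 8.
for task in sorted(tasks, key=lambda t: -efficiency_ratio(tasks[t])):
    print(f"{task}: {efficiency_ratio(tasks[task]):.2f}")
```

Off-site (O) moves are excluded from the denominator, matching the paper's decision not to count them in this formula.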

Table 8. Ratio of Correct (C) Moves to the Sum of Incorrect (I) and Back (B) Moves

Task                                  N    C/(I+B)   Avg. C/(I+B)   Difference
Time design library closes            37    24.00        2.51          21.49
Find number for phone renewal         43     5.00        2.51           2.49
Call number to Vanity Fair (book)     37     4.14        2.51           1.63
Dean of Special Collections           21     2.50        2.51          -0.01
Find link to Medline                  37     2.46        2.51          -0.05
Sigrid Kelsey's e-mail                36     2.45        2.51          -0.06
Link to Project Muse                  43     1.80        2.51          -0.71
Reserving electronic classroom        14     1.61        2.51          -0.90
Find art databases                    20     1.22        2.51          -1.29
Y2K resources at LSU                  20     0.89        2.51          -1.62
Total                                308

Of note is the fact that the design library closure question was by far the most efficient search by our measure, a 24 to 1 ratio of correct to incorrect and back moves. The average ratio across all tasks was 2.51. The rest of the tasks' ratios were much closer to the average, and seven of the ten tasks were at or below the across task average. Being below average, by our measure, means that the search was less efficient, demonstrating a tendency for subjects to make fewer correct moves or for them to make more incorrect and back moves. Finding art databases and Y2K resources were the least efficient, according to this measure. Once users went off the libraries' site, the circumstances changed, so we did not include off-site moves in this formula.

Another way to examine the efficiency with which subjects navigated the site is to ask, "How many moves, on average, did each subject make during a task?" That is, we address how closely the subjects were able to approach the optimal number of moves for each task. In addition, this measure normalizes the effects of varying numbers of subjects in each task. As in table 8, we used table 9 to rank the tasks from most efficient to least; the results were similar, but not identical. It should be remembered that this measure is based purely on how many moves were made per subject, not on what kind. Therefore, efficiency by this measure is closely linked to the optimal number of moves for each task. The measure on which table 9 is sorted is moves per subject minus the optimal number of moves. Table 9 shows that the fewest moves per subject were made in finding Sigrid Kelsey's e-mail address, and the most were spent finding the link to Project Muse. None of the tasks was accomplished, on average, in the optimal number of moves, although individual subjects did routinely complete tasks using the optimal strategy.

Table 9. Comparison of Within-Task Differences between Moves per Subject and Mean Moves per Subject

Task                                  N   Total    A: Avg.       B: Optimal    A - B   C: Mean moves/     A - C
                                          moves    moves/subj.   moves/task            subj., all tasks
Sigrid Kelsey's e-mail               36     61       1.69            2         -0.31        2.51          -0.81
Call number to Vanity Fair (book)    37     75       2.03            2          0.03        2.51          -0.48
Find number for phone renewal        43     78       1.81            1          0.81        2.51          -0.69
Y2K resources at LSU                 20     37       1.85            1          0.85        2.51          -0.66
Time design library closes           37     79       2.14            1          1.14        2.51          -0.37
Find art databases                   20     71       3.55            2          1.55        2.51           1.04
Find link to Medline                 37     95       2.57            1          1.57        2.51           0.06
Link to Project Muse                 43    159       3.70            2          1.70        2.51           1.19
Dean of Special Collections          21     70       3.33            1          2.33        2.51           0.83
Reserving electronic classroom       14     47       3.36            1          2.36        2.51           0.85
Total                               308    772       2.51

It is illuminating to compare tables 8 and 9 as efficiency measures. For example, the e-mail question had about an average C/(I+B) ratio, yet it was the most efficient in terms of average moves per subject. At the same time, the design library closure time task was highest on the C/(I+B) ratio but dropped to the middle of the moves-per-subject ranking. This is because these are two different measures of efficiency: one compares correct with incorrect moves in each task, and the other measures the average number of moves each subject made during a task. Analysts should refer back to the raw numbers (see tables 6 and 7, both of which express raw counts) to ascertain why such differences occur. In the case of the e-mail and design library questions, it should be remembered that subjects in the design library task made only two incorrect moves and one back move against seventy-two correct moves, while the e-mail question's subjects made six incorrect, five back, and twenty-seven correct moves. This indicates that, when considering redesign, the part of the navigation system that leads to e-mail addresses should be examined more closely than the part that leads to hours of operation for branches.

Discussions and Conclusions

This study was conducted to find out more about how well a particular Web site's navigation system works for various people doing various tasks. We conducted a two-phased study in which: (1) users were surveyed regarding their opinions and feelings about the Web site in general and its navigation system in particular; and (2) users were assigned tasks to test the Web site's navigability. Two measures of navigation efficiency were developed in the course of the study: (1) the ratio of correct to non-correct moves for each task; and (2) a comparison of average moves to optimal moves. We found that some parts of the site's navigation system worked better than others. It should be noted that this study indicates which parts of a site's navigation system should be investigated and which parts might be left alone. This is an important question when studying a site's information architecture.

The two phases of our study helped identify problems, and may be used in the analysis of other sites. The first phase identified areas in which there may be problems. For example, the quote that stated dissatisfaction with the site provided us with specific information about sources of problems in the site. In addition, while the survey items suggested that users were generally satisfied with the site, there were also indications that some parts of the site needed attention. For example, consider the following items and average results (on a 1 to 5 scale, 1 = strongly agree, 5 = strongly disagree):

(item 6) I always knew where to find what I was looking for (4.2)

(item 10) Using this Web site was fun (4.3)

(item 13) There is lots of help available on this Web site (4.0)

(item 34) It was easy to find specific information on this Web site (3.6).

Items 6, 10, and 34 are directly related to site navigation. These items do not give overwhelming support to the notion that the site is easy to navigate. Item 10 is mentioned only as another way to subjectively evaluate the overall effectiveness of the site.

Based on the results of the survey and assignment, several changes were made to the Web site. Approximately three hundred out-of-date files were removed so that they no longer show up in the results of a Search this Site search. A dynamic database listing all the databases, indexes, e-journals, and subject guides now makes it easier for patrons to find these resources by topic or title. Moreover, cross-referencing links to e-journals on a topic appear if a user searches for databases under that topic. This not only makes finding a database easier for patrons; it also streamlines the maintenance of the Web site.

We were correct in adding an A to Z list in the initial redesign: it was heavily used during the user tests. A problem with the initial A to Z list that became apparent during the user tests and surveys was that it did not contain enough links. More than two hundred headings were added to the A to Z list, making it more comprehensive than before.

Our study provides a practical basis for further Web site studies, whether on the LSU Libraries' site, or other sites. To determine the final success of the study, further research is needed to determine whether ratings of the Web site and the success rate of searching for known items are higher than in our initial study.

Acknowledgements

The authors would like to thank Liz Shaw and Don King, both on faculty at the University of Pittsburgh, for taking the time to read and comment on the paper. Their suggestions made the paper stronger.


References

   1. American Library Association, User Access to Services Committee, RUSA Machine-Assisted Reference Section, "User's Information-Seeking Behavior: What Are They Really Doing? A Bibliography," Reference and User Services Quarterly 40, no. 3 (spring 2001): 240-50.

   2. Somporn Puttapithakporn, "Interface Design and User Problems and Errors: A Case Study of Novice Searchers," RQ 30, no. 2 (winter 1990): 195-204.

   3. Tim Bucknall and Rikki Mangrum, "U-Search: A User Study of the CD-ROM Service at the University of North Carolina at Chapel Hill," RQ 31, no. 4 (summer 1992): 542-53.

   4. Michael Culbertson, "Analysis of Searches by End-Users of Science and Engineering CD-ROM Databases in an Academic Library," CD-ROM Professional 5, no. 3 (Mar. 1992): 76-79.

   5. Lara D. Catledge and James E. Pitkow, Characterizing Browsing Strategies in the World Wide Web. Accessed Apr. 1, 2002, www.igd.fhg.de/archive/1995_www95/proceedings/papers/80/userpatterns/UserPatterns.Paper4.formatted.html.

   6. Laurel A. Carter, "Building a Better Reference Interface," Internet Reference Services Quarterly 3, no. 4 (1998): 57-84.

   7. Jerilyn R. Veldof, Michael J. Prasse, and Victoria A. Mills, "Chauffeured by the User: Usability in the Electronic Library," Journal of Library Administration 26, no. 3/4 ( 1999): 115-40.

   8. Jakob Nielsen, Usability Engineering (San Francisco: Morgan Kaufmann, 1994); R. Wise, The Three Click Rule, Dec. 15, 2000. Accessed Apr. 1, 2002, www.website-owner.com/articles/design/3clickrule.asp.

   9. Jakob Nielsen, The Top Ten New Mistakes of Web Design, May 30, 1999. Accessed Apr. 1, 2002, www.useit.com/alertbox/990530.html.

   10. Louis Rosenfeld and Peter Morville, Information Architecture for the World Wide Web (Cambridge, Mass.: O'Reilly, 1998).

   11. David Robins, "Information Architecture, Organizations, and Records Management," Records and Information Management Report 17, no. 3 (2001): 1-14.

   12. J. Dietrich, K. Gordon, and M. Wexler, Effects of Link Arrangement on Search Efficiency. Accessed Apr. 1, 2002, www.otal.umd.edu/SHORE/bs09/index.html#toc.

   13. John Morkes and Jakob Nielsen, Concise, Scannable, and Objective: How to Write for the Web. Accessed Apr. 1, 2002, www.useit.com/papers/webwriting/writing.html.

   14. Nielsen, Usability Engineering.

   15. Human Factors Research Group, Ireland. Software Usability Measurement Inventory (SUMI). Accessed Apr. 1, 2002, www.ucc.ie/hfrg/questionnaires/sumi/index.html.

   16. Rosenfeld and Morville, Information Architecture for the World Wide Web.


   David Robins ( drobins@pitt.edu) is Assistant Professor at the University of Pittsburgh School of Information Sciences; Sigrid Kelsey ( skelsey@lsu.edu) is Electronic Reference Services and Web Development Coordinator at Louisiana State University Libraries, Baton Rouge.