Thinking beyond the Disjunctive Opposition of Information Literacy Assessment in Theory and Practice

Terrance S. Newell, Doctoral Student, School of Library and Information Studies, University of Wisconsin, Madison

School library literature has illuminated several structural barriers (such as lack of time and role perception conflict) that impede many school library media specialists (SLMSs) from fulfilling their student assessment role in practice, and specialists have identified technology as a mediating artifact that could aid in an expansion of that role. The purpose of this study is twofold: (1) to interrogate a middle school’s information literacy assessment system for internal disturbances (tensions and contradictions) between system elements that restrict the SLMS’s ability to assess information literacy; and (2) to design a technology-based instrument that addresses these system disturbances and enables SLMSs to expand their assessment roles. The researcher used rapid design ethnography coupled with an activity theory (AT)-based analysis to produce an in-depth description of the assessment system disturbances. A technology-based assessment instrument was then designed and developed to address the system disturbances. The ethnographic methods and AT analysis informed the design and development of a Virtual Reality Information Literacy Learning and Assessment Space (VILLAS). VILLAS addresses the four primary assessment disturbances illuminated in recent library literature in addition to the site-specific disturbances identified in this study, and it has the potential to decrease the distance between the librarian’s assessment roles as envisioned in theory and as realized in practice.

The assessment of student learning and performance is a current challenge for many school librarians attempting to prepare students to navigate, evaluate, and use diverse print and electronic environments. School library scholars (Kuhlthau 1994; Jackson 1993; Neuman 1993; Stripling 1993) helped initiate an expansion of the assessment discourse in school librarianship beyond testing towards alternative and authentic assessments. Their works, coupled with seminal pieces such as Information Power: Building Partnerships for Learning (AASL and AECT 1998), aided in the construction of the present theoretical and epistemological substratum of assessment in school librarianship. In theory, SLMSs work with other educators in planning, teaching, and assessing student learning. Within these collaborative environments, SLMSs exert their leadership in a variety of ways—for example, by sharing effective information literacy, teaching, and assessment strategies with other educators or by directly assessing student learning. Theoretically, the level of involvement in planning, teaching, and assessment should be no less than a partnership with other educators. However, the current practice of the instructional (including assessment) and curriculum roles in many school library media centers does not reflect the present theoretical and epistemological expectations of assessment in school librarianship articulated in professional literature and national guidelines (Callison 1995; Jones 1997; Kinder 1995; Lewis 1990; Pickard 1993; Bishop 1996; Stoddard 1991; McCarthy 1997). Thus, many SLMSs are experiencing a disjunctive opposition, or profound conflict, between information literacy assessment in theory and as practiced (Jones 1997; Person 1993; Pickard 1993); in fact, SLMSs practice the roles of instructional, assessment, and curricular leader or partner less often than any other role articulated in national guidelines (McCracken 2001). Recent library literature has illuminated assessment system impediments to the practice of information literacy assessment (for example, lack of time, role perception conflict, lack of teacher interest in cooperation, and too many students to serve) that restrict many SLMSs’ ability to assess student learning. These assessment system impediments and their causal issues demand new approaches to information literacy assessment. Either the new approaches must transform existing alternative assessment techniques to complement the teaching environment and culture of school library media centers, or SLMSs must construct entirely new assessment techniques that harmonize with their skill domain, library culture, and working environment.

Purpose of the Study

The broad move from the dominance of testing to that of alternative assessment and the existence of structural impediments to the practice of alternative assessment in school library media centers are producing assessment system conflicts that restrict many SLMSs’ ability to assess information literacy. The purpose of this study is twofold: (1) to interrogate a middle school’s information literacy assessment system for internal disturbances (tensions and contradictions) between system elements that restrict the SLMS’s ability to assess information literacy; and (2) to design a technology-based instrument that addresses these system disturbances and enables SLMSs to expand their assessment roles. The researcher undertakes this effort in the belief that the student assessment role of specialists as envisioned in theory and as realized in practice are not incommensurable in spite of existing impediments (for example, role perception conflict), and that SLMSs may be correct in their identification of technology as an instrument that could help them expand their instructional and assessment roles (McCracken 2001).

Theoretical Approach

This study is concerned with the transcendence of dualisms and conflicts between assessment theory and practice within information literacy assessment systems. Therefore, the researcher employs an activity theory (AT)-based perspective because it provides a theoretical lens through which levels of analysis are applied upon an activity system for the purpose of transcending dualisms and conflicts (Engestrom 1999). AT focuses upon the object-oriented, artifact-mediated, collective activity system as its unit of analysis without overly focusing on either the individual subjects or the broad system (Engestrom 1999). “Minimum elements of a system include the object, subject, mediating artifacts (signs and tools), rules, community, and division of labor” (Engestrom 1999, 9) (see  figure 1).

Object refers to the object of activity, or the socially distributed and collective purposes of activity within a system. “An object can be a material thing, but it can also be less tangible (such as a plan) or totally intangible (such as a common idea) as long as it can be shared for manipulation and transformation by the participants of the activity” (Kuuti 1996). All activity systems attempt to transform their objects into outcomes, the materialization of the object-oriented activity. Subjects are agents that participate in activity towards the object, and they combine to form a community of agents in collective activities, such as information literacy assessment. The idea of artifact mediation is central to activity theory. According to the theory, artifacts (such as rules, division of labor, and mediating tools) mediate the subject(s) and community’s transformation of the object into desired outcomes. Artifacts can range from physical tools, such as technological instruments, to nonphysical tools like language, procedures, and methods. Using AT, it is possible to interrogate the internal disturbances (tensions and contradictions) between system elements and to construct new artifacts that make the system’s transformation possible.
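
To make these elements concrete, the following sketch (purely illustrative and not part of the original study) models an activity system and its internal disturbances as a simple data structure; all names and values are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class ActivitySystem:
    """Minimal model of the activity system elements named above."""
    subject: str
    object: str
    outcome: str
    mediating_artifacts: list = field(default_factory=list)
    rules: list = field(default_factory=list)
    community: list = field(default_factory=list)
    division_of_labor: dict = field(default_factory=dict)
    disturbances: list = field(default_factory=list)  # (element_a, element_b, note)

    def add_disturbance(self, element_a: str, element_b: str, note: str) -> None:
        """Record a tension or contradiction between two system elements."""
        self.disturbances.append((element_a, element_b, note))

# Hypothetical example: a school's information literacy assessment system
system = ActivitySystem(
    subject="school library media specialist",
    object="ensure each student meets district information literacy standards",
    outcome="documented evidence of student information literacy",
    mediating_artifacts=["portfolios", "performance tasks"],
    community=["SLMS", "classroom teachers"],
    division_of_labor={"SLMS": "plan, teach, assess", "teachers": "plan, teach, assess"},
)
system.add_disturbance("subject", "object", "one part-time specialist serving all students")
print(system.disturbances)
```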

Review of the Literature

This review is a specific analysis of the literature that created the theoretical underpinnings for a paradigm shift toward alternative assessments in school library media centers and the structural forces that impede their use. The literature review is divided into the following sections: (1) courtship between school library media centers and alternative assessment; (2) dominant types of alternative assessment in school librarianship; (3) successful implementation of alternative assessments; and (4) review summary.

Courtship between School Library Media Centers and Alternative Assessment

The term information use signals a paradigm shift in skill type from library skills to information skills, and it has intensified a courtship between school library media centers and alternative assessments. The shift from information location skills to information-processing skills created a new emphasis on developing transferable cognitive skills that should increase students’ effective use of information in general as well as their use of specific libraries and resources (Eisenberg and Brown 1992; Jackson 1993). The intensified courtship between school library media centers and alternative assessment also stems from the similarities between the philosophy of alternative assessment and the philosophy of school library media centers (Neuman 1993).

At a fundamental level, the philosophy of alternative assessment parallels that of school library media centers by emphasizing the use of information, complex thinking skills, and an ongoing nature. Both philosophies assume that using information—rather than simply possessing it—is the most important component of intellectual activity (Neuman 1993). Alternative assessments are exercises that ask students to demonstrate and use their knowledge and skills by undertaking some type of activity (Rothman 1996), and this idea of use is echoed by the mission of school library media programs to ensure that students and staff are effective users of ideas and information (AASL and AECT 1998; Neuman 1993).

Each philosophy also has continuity as a primary concern. As previously stated, the philosophy of alternative assessment articulates the idea that assessments should be ongoing, natural parts of students’ experiences, and this philosophy corresponds with one of the missions of school library media programs, which is to provide resources and activities that contribute to lifelong learning (AASL and AECT 1998; Neuman 1993). According to this idea of continuity, educators assess student progress along levels of continuous understanding and performance. The continuum extends from lower, more elementary knowledge, understanding, and skills to more advanced levels and these levels describe understanding in terms of qualitatively distinguishable performances along the continuum (Wilson and Adams 1996). SLMSs believe that the ability to find and use information is the keystone of lifelong learning, and they are committed to creating the foundation for such learning (AASL and AECT 1998; Neuman 1993). Furthermore, the entire process of information skills instruction assumes the ongoing and direct assessment of students’ efforts with the goal of constantly improving their skills (AASL and AECT 1998).

Another major idea embedded within the philosophy of alternative assessment is the encouragement of higher-order thinking skills. Alternative assessments encourage students to use higher-order thinking by requiring them to use their judgment, knowledge, and skills in complex environments or projects. This idea is echoed by the school library media center’s commitment to fostering higher-order thinking skills by providing students with opportunities to learn how to locate, analyze, evaluate, interpret, and communicate information and ideas (AASL and AECT 1998; Neuman 1993).

Due to the paradigm shift from library skills to information skills and the paralleling philosophies of alternative assessments and school library media centers, SLMSs are replacing traditional testing with assessment that is ongoing, open-ended, and in a real-life context (Stripling 1993). These alternative assessments provide a more complete view of information use, progress, and achievement.

Dominant Types of Alternative Assessment in School Librarianship

The term alternative assessment in school librarianship usually evokes the image of two systematic techniques that attempt to augment the use of personal contact in gaining information about student learning—performances and portfolios. Portfolios are purposeful collections of students’ work that illuminate their efforts, abilities, progress, and understandings, and portfolios provide a complex and comprehensive view of student performance in context (Turner 1993; Paulson, Paulson, and Meyer 1991). Callison (1993) stated that school librarians naturally understand the characteristics of portfolio assessment because school librarians are increasingly practicing some aspect of activity documentation in respect to the library collection and the curriculum. He encouraged school librarians to move beyond the use of records collection for showing the value of the school library towards the use of portfolios as a method for student assessment.

Performance-based assessments are exercises that ask students to demonstrate their knowledge and skills by undertaking some type of performance (Rothman 1996). They provide a basis for teachers to evaluate both the effectiveness of the process or procedure used and the product resulting from performance of a task. These types of assessment usually require the student to demonstrate skills in a real-world environment or to complete a project by assuming the role of a real-life professional. These assessments, according to some, are beneficial because they provide real incentives, drive instruction and learning in positive ways, and focus learning on higher-order or complex thinking skills (Madaus and Tan 1993; NCEST 1992).

Successful Implementation of Alternative Assessments

The successful implementation of alternative assessments in school library media centers is dependent on the construction of a proper learning environment by librarians and on assessment system constraints on the instructional and assessment roles. The field has historically focused on the construction of a proper learning environment when discussing successful implementation of alternative assessment. According to the historical discussion, alternative assessment will flourish in librarian-constructed learning environments that nurture the students’ complex thinking and reflection, establish assessment as a learning experience in itself, value students’ progress as well as their final achievements, and are ongoing in nature (Turner 1993; Zessoules and Gardner 1991). The learning environment must also promote collaboration, the use of multiple resources, ownership, and time flexibility (Neuman 1993).

Focusing solely upon the librarian-constructed environment in discussions of successful implementation of alternative assessment is quite dangerous. Such discussions treat assessment system disturbances as nonexistent and fail to examine the innumerable cases in which SLMSs are willing to construct the proper learning environment and are still unable to assess student learning. Although much of the research literature does not directly address the assessment role of SLMSs, it is a role that is embedded within the instructional and curricular roles of the SLMS, and when these roles are impeded, so too is assessment. The literature has illuminated several system barriers that impede many school librarians’ fulfillment of an instructional, curricular, and assessment role. These barriers are a lack of time, a lack of interest and support from classroom teachers, inadequate staff size, too many schools or students to serve, and misconceptions about the librarian’s role.

Time has been identified as a system deterrent to the development and full implementation of the instructional and curricular roles articulated in professional literature and national guidelines (e.g., Giorgis and Peterson 1996; Van Deusen and Tallman 1994; McCracken 2001). The degree of existence for all roles articulated by Information Power (1998) depends on a structure of time that allows those roles to come into being (Giorgis and Peterson 1996). This is reflected in studies that elicited written comments from librarians who felt overwhelmed by the number of roles theoretically implied and the lack of time for role development. For example, a study (McCracken 2001) that attempted to determine whether practicing SLMSs perceived that they had been able to implement their roles as described in Information Power (1998) elicited such comments as “I wish I had the time to be the person your form reflects” and “Can anyone do everything on your form?” McCracken (2001) also identified several deterrents to the instructional and curricular role, with time being the primary deterrent. The findings of the McCracken (2001) study support previous research that identified time as a major structural impediment to the instructional and assessment roles of SLMSs (e.g., Fedora 1993; Stoddard 1991; Ervin 1989; McCarthy 1997).

Research also shows that school administrators and teachers are putting their misperceptions of the librarian’s role into operation upon the SLMS through the creation of professional kinds (Dorrell and Lawson 1995; Ceperley 1991). In accordance with Hacking’s (1996) use of the term human kind, the term professional kind is used to emphasize a system of classification that attempts to identify a kind of library professional and their behaviors, conditions, tendencies, temperaments, and roles. The system of classification that administrators and teachers are currently using emerges from the school librarian’s traditional roles (Lai 1995). This system of classification has created misperceptions of the SLMS’s role and traditional expectations that are impeding role possibilities (Naylor and Jenkins 1988; Ervin 1989). These role restrictions are caused by looping effects, which occur when the misperceptions (e.g., SLMS as a steward of books) and expectations (e.g., librarian as a resource provider to the real educators) reenter the universe of everyday actions (Hacking 1996). Research shows that looping effects are occurring (Lai 1995; Ervin 1989). For example, McCracken (2001) stated, “library media specialists are prevented from taking a more active role in instruction because of the perceptions and expectations of teachers and principals” (8). Respondents to McCracken’s (2001) study stated, “The administration does not consider how important the library is to developing a solid and well-rounded curriculum that promotes maximum learning” and “Administrators think all we do is check out books” (8). Library literature such as Hauck and Schieman (1985) shows that the aspect of the SLMS’s role most subject to misperception is that of instructional, assessment, and curriculum leader.

Researchers have shown that teachers’ misperceptions of the SLMS’s role manifest themselves in educational practices and affect the interest and support that teachers give in respect to the instructional/assessment and curriculum role of SLMSs (e.g., Lewis 1990; Jones 1997; Kinder 1995; Pickard 1993; McCracken 2001). Participants in McCracken’s (2001) study frequently made comments such as:

“most teachers have a traditional view of the librarian’s role—it is challenging to get them to view me as a teacher, too,” “teachers aren’t eager to collaborate—or don’t always see the need to do library projects,” and “teachers are set in their ways and do not want to cooperate.” (8)

These misperceptions provide the foundation for a community of instructional/assessment practice that marginalizes SLMSs on the periphery of teaching, learning, and assessment; within such a community, the full implementation of an assessment role by SLMSs is unlikely to occur.

SLMSs are also severely outnumbered. They serve more students than any other educator in the school, and the number of students coupled with inadequate staffing has been identified as an impediment to fulfilling an instructional/assessment role (McCracken 2001; Kinder 1995).

Summary

The field of school librarianship has made a broad move from the dominance of testing to that of alternative assessment. The theoretical and epistemological substratum for this shift was constructed by the professional literature and such seminal guidelines as Information Power: Building Partnerships for Learning (1998). Successful implementation of alternative assessment in school library media centers is dependent on the librarian-constructed environment and on assessment system constraints on the instructional/assessment role. The field has historically focused on the librarian-constructed learning environment, rendering assessment system constraints as secondary. Although much of the research literature does not directly address the assessment role of the SLMS, it is a role that is embedded within the instructional and curricular roles of the SLMS, and when these roles are impeded, so too is assessment. The literature has illuminated four major impediments to the SLMS’s attempt to assess student learning. First, many SLMSs do not have time to fulfill an instructional/assessment role; therefore, they cannot assess students on a consistent basis. Second, teachers are situating SLMSs on the periphery of teaching, learning, and assessment, thus obstructing the full implementation of an assessment role by SLMSs. Third, school administrators and teachers are imposing their misperceptions of the librarian’s role upon the SLMS and impeding role possibilities. Fourth, the number of students served by school library media centers coupled with inadequate staffing makes ongoing alternative assessment a great challenge.

These system impediments and their causal issues demand new approaches to information literacy assessment. Either the new approaches must transform existing alternative assessment techniques to complement the teaching environment and culture of school library media centers, or SLMSs must construct entirely new assessment techniques that harmonize with their skill domain, library culture, and working environment.

Research Method

This is a design research study. The purpose of design research is to produce bounded systems (e.g., a system of assessment) or transformative interventions constructed upon an in-depth knowledge of the social circumstances in which they are deployed and used (Sperschneider and Bagger 2000). What distinguishes design research from traditional forms of inquiry is not the methods used to gather information, but rather the unique purpose of the inquiry: to produce a culture-specific instrument or system. Since the early 1990s, researchers and designers have increasingly used ethnography as a research tool in the system design and system transformation processes, and the researcher uses it in this study because it has the potential to provide great insight into the subtleties and tensions involved in the assessment of student learning.

In particular, this study employs rapid design ethnography, which is simply the use of ethnographic methods to produce a culture-specific instrument or system within a short space of time (three months for this study).

Research Design

This rapid design ethnography utilized three data collection procedures: (1) direct observation; (2) interviews; and (3) document analysis. The researcher used direct observation as a method to obtain data on the librarian-constructed teaching and assessment environment, the collaborative teaching and assessment environment, and system factors impeding the instructional/assessment role of the SLMS. The observation process included the unobtrusive recording of the librarian’s, teachers’, and students’ behaviors and interactions in respect to the particular focus of study. Observational data were recorded using field notes.

Interviews were used to obtain insights on the SLMS’s philosophical and practical perspective on teaching and assessment, the library’s culture, routine independent and interdependent teaching and evaluative activities, students’ perspectives of learning and assessment, the ideal assessment vision, and system deterrents to assessing information literacy. The researcher also interviewed teachers to obtain insights on teaching and assessing information literacy, students’ perspectives of teaching and assessment, and their perceived role of the librarian in respect to instruction and assessment. All interviews were semi-structured, which allowed for further questioning based on participants’ responses. All interviews (with one exception) were audio recorded.

Document analysis or data gathering through the collection of existing documents was also employed during the rapid design ethnography. The researcher collected any documents such as activities used during teaching, lesson plans, student final projects, school newsletters, and tests that would provide insight into the problem studied.

Selection of Site and Participants

This study attempts to use technology as an instrument that could potentially aid in the fulfillment of the instructional/assessment role impeded by system disturbances in many school library media centers. Thus, technology is the basis for the study’s selection frame. The school district chosen for this study has a very innovative Internet program housed within eleven of its middle school media centers and computer labs. The researcher used the Internet program as a primary selection tool because it was assumed that any media center with the technological foundations and staff training to house an innovative Internet program for a number of years should have the foundation to implement and maintain an innovative assessment instrument. As stated above, eleven school media centers (all situated in middle schools) are presently participating in the program. The Internet program schools were numbered, and the researcher used a random number table to select four potential sites. East Middle School was the first site selected, with West second, North third, and South fourth. The researcher then informally interviewed the SLMS at each site to determine if the specialist supported learning environments that allow the characteristics of alternative assessment to flourish, and if the SLMS was interested in the project. East Middle School was selected as the research site because it was the best technology fit, the SLMS and school administration were very interested in the study, school staff supported the idea of alternative assessment, and it coincidentally was the first site randomly selected. The researcher used purposeful sampling to select the SLMS because there is usually only one per site in middle school settings, and the study employed convenience sampling to select classroom teachers.

Description of Participants

One SLMS participated in the study. She has a master’s degree in library studies and more than ten years of experience. The researcher recruited three teachers for the study, and they all agreed to interviews. The three teachers directly recruited for the study teach the following subjects: eighth-grade history, seventh-grade science, and seventh-grade social studies. The researcher also observed five classes (in the library setting) throughout their completion of classroom projects that utilized the library.

Description of Site

East media center operates within a middle school located in a small Midwestern city. The mission of the media center parallels that of the school, which is to educate all students with the knowledge, skills, and confidence required to participate in a global society. The fulfillment of this mission, to educate for participation in a global society, is based primarily on two guiding principles: (1) thematic, integrated, and multicultural curricula; and (2) technology integrated throughout the curriculum. The school is one of the district’s technology showcase schools. It has a student-to-computer ratio of four to one due to its two computer labs, a mobile laptop lab, the library’s computer center, and fully networked classroom computers. The school actively attempts to ensure that educators are using the technology across the curriculum to meet the school’s mission, instead of only using the technology for planning and documentation.

The school library media center has one part-time SLMS and one part-time library aide. The SLMS has a master’s degree in library studies and more than ten years of experience, and this is the aide’s first year working in a library setting. In an effort to keep the school library media center open five days per week, the SLMS and the aide alternate days. Thus, the specialist and the aide are not usually present at the same time. The media center serves 240 students, 78 percent of whom are culturally and racially diverse. The library also serves the most racially and culturally diverse teaching staff in the district.

Data Analysis

Following a perspective from cultural-historical AT, the researcher focused on the analysis of assessment system disturbances identified in the transcriptions and the field notes. A primary objective in an analysis employing an activity system perspective is to describe the activity system elements (i.e., subject, object, mediating artifacts, rules, community, labor division, and outcome) and their relationships, and to interrogate the disturbances within or between the elements (Collins, Shukla, and Redmiles 2002). In this ethnography, analysis included an interrogation of the assessment system disturbances and the insights they offer toward an assessment instrument that could revolutionize the assessment system. Analysis began during the data collection phase. The researcher documented this initial analysis in field notes as activity theory-based thoughts. After the three-month data collection period, the researcher read the field notes and transcripts several times and identified overall system disturbance themes.
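
As an illustration of this kind of theme identification (a hypothetical sketch, not the researcher’s actual coding instrument), coded excerpts could be tagged with the pair of system elements they implicate and then grouped into candidate disturbance themes:

```python
from collections import defaultdict

# Hypothetical coded excerpts: (excerpt summary, pair of implicated elements)
coded_notes = [
    ("teacher equates information literacy with locating books", ("subjects", "object")),
    ("teachers view the librarian as a resource provider only", ("subjects", "division of labor")),
    ("collecting student work takes too long for a part-time specialist", ("tools", "object")),
    ("upstairs classes rarely come down to the library", ("community", "object")),
]

# Group excerpts by element pair to surface candidate disturbance themes
themes = defaultdict(list)
for excerpt, pair in coded_notes:
    themes[pair].append(excerpt)

for (element_a, element_b), excerpts in themes.items():
    print(f"{element_a} versus {element_b}: {len(excerpts)} excerpt(s)")
```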

Presentation and Analysis of Data

The researcher employed an AT perspective that incorporated levels of analysis utilized in the Engestrom (1999) activity system model (see  figure 1). In remaining true to this perspective, the researcher used Engestrom’s activity system model to present the findings of the study (Collins, Shukla, and Redmiles 2002).

According to the activity system model, there are specific primary elements in an expanded activity system (i.e., subject, object, outcome, rules, community, division of labor, and mediating artifacts). According to Engestrom’s model, these are the characteristics of East’s information skills assessment system.

Object

The object of the assessment system is to ensure that each student meets the information literacy standards of the district by eighth grade.

Outcomes

All activity systems attempt to transform objects into outcomes. The assessment system under study is motivated to transform its object into the outcomes listed in  table 1.

Subjects

The SLMS and teachers are the primary subjects in this assessment system. They are predominantly responsible for information learning and assessment.

Community

The assessment system is theoretically a collective activity that transforms teachers and the SLMS into a community of agents working towards a shared object. Within the system, the SLMS and teachers should collaborate in the teaching and assessment of information skills during an introductory technology course, curriculum-based classroom projects, and during library-based projects.

Mediating Artifacts

The idea of artifact mediation is central to AT. According to the theory, artifacts (e.g., rules, division of labor and mediating tools) mediate the subject(s) and community’s transformation of the object into desired outcomes. Artifacts can range from physical tools, such as technological instruments, to nonphysical ones, such as procedures and methods.

Division of Labor

The division of labor among the community of agents emerges from the cultural-historical meaning of being a certain type of professional, teacher, or SLMS. As previously stated, the assessment system is theoretically a collective activity that transforms teachers and the SLMS into a community of agents working towards a shared object. However, each agent or group of agents has specific actions and roles to perform. Theoretically, the librarian’s role is to “work with teachers to plan, conduct, and evaluate learning activities that incorporate information literacy” (AASL and AECT 1998, 50). Table 2 summarizes the Sketch School District (an identifier created by the author for the school district from which East was selected), East Middle School, and Information Power (1998) visions of the SLMS’s role, and the teacher’s role, which is prescribed predominantly by the cultural-historical structure of the educational system.

Tools

Within this system of assessment, subjects theoretically use assessment tools to ensure that each student meets the information literacy standards. According to the school’s own primary documents, the information literacy assessment tools should be collections of students’ work that illuminate their efforts, abilities, processes, and understandings, or exercises that ask students to demonstrate their knowledge and skills.

Assessment System Disturbance Themes

This section focuses on the expanded information literacy assessment system disturbances identified in this study. Articulated in this section are the disturbance themes between activity system elements that emerged from the rapid design ethnography.

Disturbance Themes between Expanded Assessment System Elements

Theme 1: Subjects versus Object

The object of the assessment system is to ensure that each student meets the information literacy standards of the district by eighth grade. The study identifies three disturbances between teachers and the object. The first disturbance presents itself as a misconception of information literacy by teachers. Teachers consistently conceptualized information literacy or library and information skills as the ability to locate information within the physical library. Although the district’s information literacy standards are very explicit in articulating the desired student competencies of information seeking, access, evaluation, use, and communication, teachers could only focus upon location skills. A social studies teacher displayed the misconception of information literacy assessment as the assessment of library location skills in the following words, which echoed the sentiments of other teachers interviewed:

If I want to do a project, the first time I come down, the librarian will usually give a tutorial of how to use the library. Where the encyclopedias are, almanacs, magazines, books and which section you will find those books in. That would be really about all that the librarian directly does as far a preteaching and then once the project begins, they are more like assistant teachers. Just there to help the kids navigate the library. Do we try to check and see where each kid is at in if they understand how to use the library? Its really more of a in-the-course of the project. Trying to figure out who is lost and who is not lost and then helping them. We are fortunate here in that the library is not overwhelming in it’s size and what it offers. Here it is pretty easy to say, here are the magazines, here are the encyclopedias. . . . And we just sort of figure out on the fly what they know and what they don’t know. They don’t have like a pretest on how to use information. In my experience, the kids know how to use the library. They are pretty good at it.

The second disturbance emerged as a dissonance of objects. The object of ensuring that each student meets the standards of the core content areas overshadows the object of ensuring information literacy. Instead of becoming a duality of objects, ensuring that students are information literate is, at most, an afterthought during the curriculum-based classroom projects, as articulated in statements such as:

We don’t really do a lot of research. The thing is that it takes so much time to do research, and I have so much material to cover.
And, from the media specialist,
The teacher usually has his agenda or her agenda when coming into the library.

The third disturbance is ambivalence toward the object. Teachers often articulated opposing attitudes toward assessing information literacy. They acknowledged that the district had information literacy standards, but they were uncertain about the need for the object. They illustrated this ambivalence in the following types of comments:

No, we don’t have a way of assessing information skills, but by now I think that students would know how to use the library. Um, if a student needs help during some type of activity then they could always ask for help. But, no. We don’t have a way of knowing what every student knows.
And
I think that a lot of these resource books here in the middle along with the encyclopedias, atlases, and like resource books about presidents and animals, kids might be poorer at using those to get their information, and are more likely to go to the card catalog and then find books that tie into the topic that they are looking for, but the information that they are looking for is going to require them to really scan through the book and glean a very small amount of information. It makes it more difficult then if they would know how to use the larger resource books. . . . They don’t have like a pretest on how to use information. In my experience, the kids know how to use the library. They are pretty good at it.

Theme 2: Subjects versus Division of Labor

The system also experienced disturbances between subjects and their perceptions of labor division. Theoretically, the librarian’s role is to “work with teachers to plan, conduct, and evaluate learning activities that incorporate information literacy” (AASL and AECT 1998, 50). However, teachers predominantly did not see the librarian as a partner in teaching and assessing information literacy. Teachers related their feelings about the teaching and assessment roles of librarians in comments that included the following:

Well, we can handle the teaching. The librarian should focus on collecting materials and scheduling. I have all of the resource that I need, but more images would be nice. And, if I could get my resources in electronic form, then that would be great.
And
I would prefer the latter [a resource provider role] with an emphasis being on organization. With this grant and the equipment being brought in, knowing where it is and being very responsible for keeping its way about the known schedule of when the library would be available and being very firm in following up on that schedule and not letting people just come in randomly. And, it would not be a bad idea, early on in the year, for the librarian to require each class so that all students are down here for a day or through the course of a week to give a tutorial on how to use it and to explain some really simple things like checking out and renewing books and what’s expected as far as checking out and renewing books if you are using the computers. Where the fiction and non-fiction is and how it is organized, that would be handy.

With respect to perceptions of the librarian’s role in the division of labor for teaching and assessing information literacy, the SLMS stated:

I am not satisfied with what goes on here in regard to that. I really feel, and again I think it is communication you know, realizing what is the role of the librarian. Do teacher’s understand that, no, I think often they think that the role of the librarian is to order books, check books in and out, keep up the library, to help when classes come in, teach some research skills. But, beyond that I don’t think that they see it as a full partnership, which it should be. We all are here to teach these kids and help them to grow and learn, and really there should not be any borders between us, and um, the reality is of course your busy ordering books and cleaning up and doing all the things you need to do to maintain the library. However, I think a really good librarian should look at it as a partnership with the teacher, and a really good teacher should look at the librarian as a partner.
She continued in saying,
I also think that the school of education needs to collaborate more with the school of library science because I don’t think teachers understand what librarians are doing. Because I have taught, I have an understanding of what they are doing.

Theme 3: Assessment Tools versus Object

Within this system of assessment, the information literacy assessment tools should be ongoing, integrated parts of class projects. These tools may be collections of students’ work that illuminate their efforts, abilities, progress, and understandings, or exercises that ask students to demonstrate their knowledge and skills. The assessment tools present two major disturbances to a system attempting to ensure that each student meets the information literacy standards of the district. The first is the unreliability of assessment across teachers who hold misconceptions of information literacy, face core content area accountability, and feel ambivalence toward the object. The misconceptions, core-content focuses, and ambivalence make the rating of information efforts, abilities, progress, understandings, and use an uncontrollable variable. The second is the time-consuming nature of collecting students’ work that illuminates their efforts, abilities, progress, understandings, and use, or of conducting exercises that ask students to demonstrate their knowledge and skills. These collections (e.g., portfolios) and demonstrations require a significant amount of time to develop learning outcomes and rating criteria and to allow students to collect evidence or demonstrate skills. One teacher explained it in the following way:

We don’t really do a lot of research. The thing is that it takes so much time to do research, and I have so much material to cover. Next year I am going to do more of certain things and less on others, and unfortunately they [students] are going to miss out on early America so I can focus on later America. The research takes a long time to do.
The SLMS stated that a major barrier to teaching and assessing information literacy is time. Her response included the following:
I am part-time. . . . When I am here, I’m am here and I am so busy here that I don’t have time to go to the classroom. So, that is a problem. I would like to ideally see more of the teachers come into the library and plan it with me, but that is difficult because we are under the constrains of time.

Theme 4: Community versus Object

The assessment system is theoretically a collective activity that transforms teachers and the SLMS into a community of agents collaborating in the teaching and assessment of information skills during classroom projects and during library-based projects. The location of these agents emerged as a system disturbance. Location is a crucial factor affecting collaboration, use of library resources during class-based projects, and the role of the librarian as information teacher and assessor. For instance, the SLMS remarked:

I don’t like the way our school is built, two floors. I feel that the upstairs is more isolated from the LMC, an out of sight out of mind sort of thing. The teachers are too busy for them to come down here so it’s a track. They tend to send their kids down with notes to me, but I don’t see them as much, and I can’t get up there because who’s going to cover the library when I’m gone. So, there are a lot of issues in regards to that. So, that’s a problem. I don’t like two-story schools for that reason. I think that libraries should be very visible and very central.
She continued in stating that,
The library does not occupy a central location in the school; thus, it is used most by sixth-grade teachers because they are closest to it.

A seventh-grade teacher described the conflict in the following way: “project time, logistically, is just very difficult with four periods throughout the day, trying to get all of the kids down into the library.”

Theme 5: Subject versus Community

The school theoretically employs an expanded assessment system composed of seven primary elements (subject, object, outcome, rules, community, division of labor, and mediating artifacts). The expanded system theoretically transforms teachers and the SLMS into a community of agents working towards a shared object. However, in practice the disturbances present within the assessment system transform it into a closed system composed of only four elements (subject, assessment tools, object, and outcome) in which the SLMS is the primary subject working towards the object. Within this qualified system of assessment, there are also disturbance themes.

The first disturbance theme emerges between the subject and object. The object of the assessment system is to ensure that each student meets the information literacy standards of the district by eighth grade, but the SLMS is the only information professional on site serving a student population of 240 students. Ensuring that each student is information literate is therefore complicated in itself; when coupled with (1) teacher misconceptions of information literacy and the librarian’s role, (2) the dissonance of objects, (3) ambivalence toward the object, (4) the time-consuming nature of class projects, and (5) the library’s location, the disturbance magnifies.

The second disturbance theme emerges between the subject and tools. The subject employs two major tools in assessing information literacy—personal contact and authentic projects. During the three months of observations, the researcher recorded the SLMS’s use of personal contact during five class projects that utilized the library. Personal contact, if used effectively, can provide valuable information about student learning. Effective use requires the SLMS to observe and evaluate the progress of students to identify needs. During the interview, the SLMS said:

because the teachers will come here and all of the sudden they decide that they are going to do a project, and so they will come into the library to do it, and I will walk around to see how the kids are doing and I will help them on an individual basis if I see that they are in need . . . basically, identifying their needs, it is up to me to walk out, which I do when they’re working and then check on them. They have a very comfortable relationship with me, so they will come to me all the time. They come to me. They even come to me when they’ve graduated from here.

The librarian’s accounts were not entirely consistent with the researcher’s observations of her activity during class projects. During four of the five class projects that took place during the field research portion of this project, the librarian was observed sitting at the circulation desk waiting to be approached by students for assistance. For example, a seventh-grade geography class entered the library and the teacher stated, “We are looking for pictures. The pictures should represent your country, and must be incorporated into your finished product.” The classroom teacher sat at a table, and the librarian positioned herself at the front desk, appearing available for questioning. The teacher told the students to use the Internet to find pictures. Three students decided to use print images and went immediately into the stacks, but they found nothing. One of the students searching for printed images went to the catalog. After using the catalog, he returned to his seat with nothing. He then began to roam around the room, and then went back to his seat. The other two students used the catalog, reentered the stacks, and returned with nothing. Two of the three students then asked the librarian for help, and she actively aided the students by taking them directly to the resource needed. The third student appeared quite frustrated, and returned to the computer area without asking the librarian for help.

Many of the computer users entered search terms repeatedly to find images, receiving many hits. Students consistently asked, “Is this a good picture?” and they did not seem to know how to evaluate visual information. None of the computer users asked the librarian for help, and she did not circulate around the class observing, asking questions, and evaluating. The personal contact assessment tool does not seem to work in this school library media center because the librarian (1) usually does not know the goals and objectives of the class projects, (2) is not viewed as a partner in information teaching and learning, (3) works in a content domain that is not valued by many teachers, and (4) does not see all students during the year.

The librarian also employs authentic projects to assess information literacy. In these authentic assessments, students are required to demonstrate information skills in authentic real-life contexts. The school library media center’s extensive body of existing documents illustrates its platform of authentic assessment. For example, students conducted research (which often included calling local organizations and interviewing subjects) on child abuse, the risks associated with smoking, and being biracial, and then they created video presentations of their findings for the school district’s local television channel. Students also made books-on-tape for the remedial reading class, and a Hmong cookbook to educate the student population on one aspect of Hmong culture. These authentic assessments are very effective because they are exercises that ask students to demonstrate their knowledge and skills, and they provide collections of students’ work that illuminate their efforts, abilities, processes, and understandings. However, they also present two major disturbances to the object. The first disturbance is the fact that only the active members of the book club (never more than ten students in the projects analyzed) participate in these projects; therefore, not all students are assessed. The second disturbance is the time commitment involved with these projects. Each project takes several months due to the nature of authentic alternative assessments and the challenges associated with getting students out of their classes.

Innovative Assessment Design Requirements

The researcher identified several design requirements from the analysis of the expanded system disturbance themes and the teaching/assessment cultural contours of the closed media center system. The design requirements emerging from the expanded assessment system disturbances include:

  • Subjects versus Object: The innovative assessment instrument must address the misconception of information literacy as the ability to locate information, and it must transcend the physical school library. It must also bring balance to the core content and information objects, and address teachers’ ambivalence toward assessing information literacy.
  • Subjects versus Division of Labor: The instrument must augment the librarian’s role of teacher/assessor.
  • Assessment Tools versus Object: The design must create assessment reliability across classrooms and core content areas, and address the time-consuming limitations of alternative assessments.
  • Community versus Object: The design must rectify the problems of collaboration and use created by library location.
  • Subject versus Community: The design must attempt to expand the library’s closed assessment system.

The design requirements emerging from the examination of the teaching/assessment cultural contours of the closed media center system include:

  • Retaining the authentic alternative assessment focus that requires students to demonstrate information literacy in real-life contexts
  • Ensuring that all students are able to participate in the authentic projects by expanding participation beyond the active members of the book club to all 240 students, and alleviating the problems of getting students out of class
  • Addressing the issue of time (i.e., the librarian’s part-time status, time-consuming nature of authentic alternative assessment, and time to fulfill other duties)

Transformative Technology Used

SLMSs have identified technology as an instrument that could potentially aid in the fulfillment of an expanded teaching and assessment role (McCracken 2001). The researcher of this article is proficient with many interactive computer-mediated authoring tools (e.g., desktop presentation, hypermedia, HTML, multimedia, video game, and virtual reality authoring tools), and he intended to use them all in the construction of a transformative technology-based artifact. However, the melding of these technologies into a transformative artifact was shaped by the rapid design ethnography. The ethnography illuminated very strong themes of transcending information spaces, undesired temporal and spatial boundaries, authentic assessment, and learning in real-life contexts, which situated virtual reality (VR) as the dominant technology to be used in the technology-based artifact; moreover, the fact that VR is a meta-medium (able to support all other existing media) influenced its selection as the foundational technology used. “Virtual reality (VR) is a term referring to computer-based technologies ranging from sophisticated 3-D simulations to full-immersion experiences in which the participants find themselves in a highly interactive, multi-sensory, artificial environment so vivid that it appears real” (Forcier 1999, 300). The researcher used desktop VR, which allows participants to examine 3-D simulations from every perspective through a computer screen and navigate the simulated environment with a control device such as a mouse or a keyboard (McLellan 1992; Forcier 1999). This type of VR also provides a first-person experience, and it is not very expensive, making it highly valuable for library and classroom settings.

VR enables people to process information more easily, provides an experience of information that is dynamic and immediate, and creates an environment for problem solving (McLellan 1992). VR is an emerging technology, and its educational implications are still being realized and researched. Researchers are realizing that VR has great educational potential as: (1) a feedback and data-gathering instrument (Hamilton 1992; McLellan 1994; McLellan 1992); (2) an experimental learning instrument (McLellan 1992); (3) a tool for nontraditional learners (McLellan 1992); and (4) as an instrument for evaluating performance (Boman, Piantanida, and Schlager 1993).

Assessment System Evolution

The rapid design ethnography coupled with an activity theory-based analysis of data that focused upon assessment system disturbances informed the design of a Virtual Information Literacy Learning and Assessment Space (VILLAS) (see figure 2). VILLAS combines several different technologies (inscribed within a virtual reality platform), information literacy concepts, and learning and assessment strategies to address an assessment system’s disturbances and to complement its pedagogical cultural contours. This section briefly explores design aspects of VILLAS, the system disturbances they address, and the potential of VILLAS to transform the assessment system by moving beyond the disjunctive opposition of student assessment in theory and as practiced.

Design Aspect 1: Authentic Problem-based Assessment

The design of VILLAS is founded upon the idea of authentic problem-based assessment, which is the practical application or demonstration of skills and knowledge in a real-world context to solve problems. The real-world context is designed into the platform using virtual reality and authentic performance-based tasks. VR allowed the researcher to create 3-D interactive information environments (including 3-D libraries, homes, museums, books, computers, televisions, and people) that appear real. Within these virtual information environments, students can move and interact freely and collaborate with each other using avatars (virtual representations of themselves), chat features, and gestures. The realistic contexts make problems more engaging and help educators assess how students think, reason, and use information to solve problems across contexts.

The authentic problem-based assessment tasks are designed to assess information literacy through the most realistic tasks possible that address the district’s standards. The tasks inscribed within the platform range from well-structured problems (e.g., location and differentiation) to less-structured problems in which problem identification, collection, organization, integration, evaluation, and use of information are emphasized. Most tasks also require students to use the virtual information environments, make observations, collect and analyze data, and use different types of equipment.
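As a rough illustration of how such a task continuum might be represented, the sketch below models a task’s degree of structure and the information literacy steps it emphasizes. The field names and the sample standard label are hypothetical placeholders, not drawn from the district’s actual standards document.

```python
from dataclasses import dataclass
from enum import Enum
from typing import List

class Structure(Enum):
    WELL_STRUCTURED = "well-structured"    # e.g., locate and differentiate sources
    LESS_STRUCTURED = "less-structured"    # problem identification through use

@dataclass
class AssessmentTask:
    title: str
    structure: Structure
    standards: List[str]         # district standard labels (placeholders here)
    steps_emphasized: List[str]  # e.g., "collection", "evaluation", "use"

# Hypothetical example of a less-structured task set in a virtual information space.
task = AssessmentTask(
    title="Plan a museum exhibit using three kinds of sources",
    structure=Structure.LESS_STRUCTURED,
    standards=["IL.3 (placeholder)"],
    steps_emphasized=["problem identification", "collection", "evaluation", "use"],
)
print(task.structure.value)  # -> less-structured
```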

This design aspect brings balance to the core content (e.g., social studies) and information literacy objects by creating a middle ground between them. Teachers in the core content areas are constantly attempting to decrease the distance between content and real life, and it is easier for the librarian to construct real-life information problems than information-infused core content problems; therefore, a balance is created that focuses upon the authentic application of content. This design aspect also frees the librarian from the marginalized teaching and assessment position created by role misconceptions and teachers’ unwillingness to collaborate, relieving her of the need to possess a profound knowledge of core content areas when constructing assessment tasks. Instead of depending on every teacher to construct ongoing class and library projects or depending on uncooperative teachers to allow her to play a role in learning and assessment, the librarian can take a leadership role in constructing assessments that do not take place in a vacuum.

The teacher, if willing, could also collaborate with the librarian to identify the main core concepts that students are studying, and the librarian could then construct authentic problems or tasks, solved within the virtual information environments, that contain concepts and principles relevant to a specific core content area.

Design Aspect 2: Information Spaces

A primary idea designed into VILLAS is that school library media centers are preparing students to navigate, evaluate, and use diverse print and electronic environments, some directly located within or linked to institutions and others independent of time and place constraints. This idea directly addresses teachers’ misconceptions of information literacy assessment as the location of information within a particular library, and it also addresses the librarian’s concerns about skill transference across information environments.

VILLAS is designed to be the virtual counterpart of this information-as-spaces reality. This intricate 3-D, desktop computer-based simulation platform is designed to surround the student, making her feel as if she is immersed in virtual information spaces. In particular, the spaces consist of the middle school library, a high school library, informal information environments (e.g., homes), and electronic environments. For every physical object of the real world, there is a virtual counterpart in the virtual space (e.g., working computers, books, stacks, tables, televisions, and people), along with supports such as real-time collaboration. These virtual objects and the virtual environment itself are interactive and respond to the participant’s actions (McLellan 1992). In some ways, the virtual platform is even better than reality: we can stop it, replay it, and stretch the limits of real information spaces (Forcier 1999).
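The claim that every virtual object responds to the participant’s actions can be sketched as a simple event-handling pattern. The object and action names below are hypothetical stand-ins for whatever the VR authoring environment actually provides; the sketch only illustrates the interaction model, not the VILLAS implementation.

```python
from typing import Callable, Dict

class VirtualObject:
    """A scene object that reacts to participant actions with registered handlers."""

    def __init__(self, name: str):
        self.name = name
        self._handlers: Dict[str, Callable[[str], str]] = {}

    def on(self, action: str, handler: Callable[[str], str]) -> None:
        self._handlers[action] = handler

    def interact(self, participant: str, action: str) -> str:
        handler = self._handlers.get(action)
        if handler is None:
            return f"{self.name} does not respond to '{action}'."
        return handler(participant)

# A virtual book that can be opened and replayed -- something a physical book cannot do.
book = VirtualObject("virtual_reference_book")
book.on("open", lambda who: f"{who} opens {book.name} to its index.")
book.on("replay", lambda who: f"Replaying {who}'s last search path through {book.name}.")
print(book.interact("student_avatar_01", "open"))
```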

Design Aspect 3: Elasticity of Temporal and Spatial Boundaries

Present within the design of VILLAS is the idea that controlling time and space could increase the degree of existence for the librarian’s teaching and assessment role. This idea addresses the time-consuming nature of alternative assessments, the librarian’s part-time status, the time needed to fulfill other roles, the decentralized location of the library, and the scarcity of library space.

VILLAS will be located on a password-protected server and accessed through a Web browser plug-in from the school’s two computer labs, the mobile laptop lab, the library’s computer center, and the networked classroom computers. This remote-access feature frees the SLMS, teachers, and students from the constraints of time and location. Students can demonstrate their information skills in 3-D virtual information spaces while remaining in the classroom or library, and teachers, students, and information specialists can stop and continue the assessment as needed.
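A minimal sketch of the password-protected access described above, assuming nothing about the actual server software or plug-in: credentials are checked once, and any of the school’s networked access points can then open a session with the returned token. The usernames and passwords are placeholders invented for illustration.

```python
import hashlib
import secrets
from typing import Dict, Optional

# Hypothetical in-memory store of salted password hashes for authorized users.
_USERS: Dict[str, str] = {}
_SESSIONS: Dict[str, str] = {}

def register(username: str, password: str) -> None:
    salt = secrets.token_hex(8)
    digest = hashlib.sha256((salt + password).encode()).hexdigest()
    _USERS[username] = f"{salt}:{digest}"

def login(username: str, password: str) -> Optional[str]:
    """Return a session token if the credentials match, else None."""
    record = _USERS.get(username)
    if record is None:
        return None
    salt, digest = record.split(":")
    if hashlib.sha256((salt + password).encode()).hexdigest() != digest:
        return None
    token = secrets.token_urlsafe(16)
    _SESSIONS[token] = username
    return token

# Any networked machine -- lab, laptop cart, or classroom -- presents the same token.
register("student_avatar_01", "placeholder-password")
token = login("student_avatar_01", "placeholder-password")
print(token is not None)  # -> True
```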

This aspect also frees time for the librarian to fulfill other responsibilities because it gives her several ways to assess with respect to time and space. The assessment can take place: (1) in the library, with the librarian circulating and assessing demonstrations in person; (2) in the classroom, with the librarian assessing demonstrations via an avatar; or (3) through a periodic examination of collections of students’ work that illuminates their efforts, abilities, progress, and understandings.
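These three arrangements could be represented as a simple record of where and how a given assessment episode takes place, with the ability to pause and resume it as described above. The names below are illustrative only and do not describe the actual VILLAS implementation.

```python
from dataclasses import dataclass
from enum import Enum, auto

class AssessmentMode(Enum):
    IN_LIBRARY_CIRCULATING = auto()   # librarian observes demonstrations in person
    REMOTE_VIA_AVATAR = auto()        # librarian assesses from elsewhere through an avatar
    PORTFOLIO_REVIEW = auto()         # periodic examination of collected student work

@dataclass
class AssessmentEpisode:
    student: str
    mode: AssessmentMode
    paused: bool = False              # episodes can be stopped and resumed as needed

    def pause(self) -> None:
        self.paused = True

    def resume(self) -> None:
        self.paused = False

episode = AssessmentEpisode("student_avatar_01", AssessmentMode.REMOTE_VIA_AVATAR)
episode.pause()     # the class period ends, so the assessment stops
episode.resume()    # and continues later without losing the episode
print(episode.mode.name)  # -> REMOTE_VIA_AVATAR
```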

Design Aspect 4: Triangulation of Rating

Rating triangulation is a design aspect of VILLAS that directly addresses the unreliability of assessment across teachers whose misconceptions of information literacy, core-content focus, and ambivalence toward the information literacy object make the rating of information efforts, abilities, progress, understandings, and use an uncontrollable variable. The triangulation elicits multiple perspectives (the teacher’s, the student’s, and the SLMS’s) on the student’s achievement of the set of characteristics or qualities being evaluated. The rating scale, a combination of numerical, graphic, and descriptive graphic scales, is used to assess a variety of information literacy learning outcomes and aspects of student development, and it is designed to assess processes, procedures, and products. A numerical rating scale is simply a scale on which raters indicate the degree to which a characteristic is achieved by marking a number. Graphic rating scales indicate the degree of characteristic achievement along a horizontal continuum, and descriptive graphic rating scales use descriptive phrases to identify the degree of characteristic achievement along a horizontal graphic scale.

Many VILLAS tasks require students to demonstrate achievement through their performances. The rating scales bring reliability to the assessment of information literacy demonstrations across classrooms by assessing the same aspects of performance for all students on a common scale. Many performances result in some type of product, and raters can assess the product as well as the process and procedure. The rating scale serves the same purpose in product assessment that it does in process assessment: it helps raters assess all students’ products in terms of the same characteristics.
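A minimal sketch of the rating triangulation, assuming a simple shared numerical scale: the teacher, the student, and the SLMS each score the same characteristics, and the three perspectives are kept side by side rather than collapsed prematurely. The characteristic names and the 1–5 range are placeholders, not the study’s actual rubric.

```python
from statistics import mean
from typing import Dict

# Each rater scores the same characteristics on a shared numerical scale (1-5 here).
Ratings = Dict[str, int]

def triangulate(teacher: Ratings, student: Ratings, slms: Ratings) -> Dict[str, Dict[str, float]]:
    """Combine three perspectives per characteristic, preserving each rater's score."""
    combined: Dict[str, Dict[str, float]] = {}
    for characteristic in teacher:
        scores = {
            "teacher": float(teacher[characteristic]),
            "student": float(student[characteristic]),
            "slms": float(slms[characteristic]),
        }
        scores["mean"] = round(mean(scores.values()), 2)
        combined[characteristic] = scores
    return combined

# Placeholder characteristics for a process-and-product demonstration.
result = triangulate(
    teacher={"evaluates sources": 4, "organizes information": 3},
    student={"evaluates sources": 5, "organizes information": 4},
    slms={"evaluates sources": 4, "organizes information": 3},
)
print(result["evaluates sources"]["mean"])  # -> 4.33
```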


   Conclusion

The rapid design ethnography coupled with the AT-based analysis identified several design requirements from the analysis of the expanded assessment system disturbances and the teaching and assessment cultural contours of the closed media center system. The expanded assessment system analysis supported the four major impediments to the SLMS’s teaching and assessment role illuminated in school library literature (i.e., time, lack of teacher-librarian cooperation, misperception of the librarian’s role, and number of students to serve). However, the examination of this particular assessment system also identified: (1) a misconception of information literacy by teachers; (2) a dissonance of objects; (3) teacher ambivalence toward information literacy assessment; (4) unreliability of assessment across teachers; (5) the time-consuming nature of alternative assessment; and (6) the location of the library as a factor impeding role implementation. The illumination of these internal disturbances between system elements informed the creation of a technology-based artifact with the potential to transform the system, enabling SLMSs to implement the assessment role implied in recent library literature.

That technology-based artifact is VILLAS, which combines several technologies (e.g., VR), information literacy concepts, and learning and assessment strategies to address an assessment system’s disturbances and to complement its pedagogical cultural contours. The mediating artifact is an authentic problem-based assessment that brings balance to the dissonance of core content and information objects and frees the librarian from the marginalized teaching and assessment position created by role misconceptions and teachers’ unwillingness to collaborate. It is the virtual counterpart of our information-as-spaces reality, which addresses teachers’ misconceptions of information literacy assessment as the location of information within a particular library as well as the librarian’s concerns about skill transference across information environments. The artifact attempts to control time and space to increase the degree of existence for the librarian’s teaching and assessment role, and it employs rating triangulation to directly address the unreliability of assessment across classrooms. These system-specific design aspects have the potential to transform the assessment system by moving information educators beyond the disjunctive opposition between student assessment in theory and as practiced.

By focusing on one bounded assessment system, the researcher allowed himself the opportunity to examine the system in depth from various angles and to build a rich, meaningful, and complete understanding of the particular complex system under investigation. This study was not concerned with the typicality of the site or system because the general purpose was to understand this particular system’s disturbances and to implement a process that could potentially produce a technology-based mediating artifact to transform the system. However, because of the consistencies between this particular system’s disturbances and those illuminated in school library literature, qualified inferences about this mediating artifact may be made, and it may be applied in similar assessment systems.


   Works Cited

American Association of School Librarians (AASL) and Association for Educational Communications and Technology (AECT). 1998. Information power: Building partnerships for learning. Chicago: ALA.

Bishop, B. 1996. Design and development of an interactive, multimedia product that prepares preservice teachers to use the library media center program. Ed.D. diss., Univ. of Houston.

Boman, D., T. Piantanida, and M. Schlager. 1993. Virtual environment systems for maintenance training. Final Report 1–4. Menlo Park, Calif.: SRI International.

Bricken, M., and C. Byrne. 1993. Summer students in virtual reality: A pilot study on educational applications of virtual reality technology. In Virtual reality: Applications and explorations, ed. A. Wexelblat, 199–218. Boston, Mass.: Academic Pr.

Callison, D. 1993. The potential for portfolio assessment. In Assessment and the school library media center, ed. C. C. Kuhlthau, 121–30. Englewood, Colo.: Libraries Unlimited.

———. 1995. Restructuring preservice education. In School library media annual, ed. B. J. Morris, 100–112. Englewood, Colo.: Libraries Unlimited.

Ceperley, P. E. 1991. Information needs 2000: Results of a survey of library media specialists. Charleston, W.V.: Appalachia Educational Lab. ERIC Document ED 340 393.

Collins, P., S. Shukla, and D. Redmiles. 2002. Activity theory and system design: A view from the trenches. Computer-Supported Cooperative Work 11, no. 1–2: 55–80.

Dorrell, L. D., and L. V. Lawson. 1995. What are principals’ perceptions of the school library media specialist? NASSP Bulletin 79, no. 2: 72–80.

Engestrom, Y. 1999. Perspectives on activity theory. Cambridge, U.K.: Cambridge Univ. Pr.

Eisenberg, M. B., and M. K. Brown. 1992. Current themes regarding library and information skills instruction: Research supporting and research lacking. School Library Media Quarterly 20, no. 2: 103–11.

Ervin, D. 1989. The effect of experience, educational level, and subject area on the philosophical acceptance, the perceived assumption, and the perceived barriers to implementation of the instructional and curricular role of the school library media specialist. Doctoral dissertation, Univ. of South Carolina. Dissertation Abstracts International 50(09A), 2767.

Fedora, A. P. 1993. An exploration of the scheduling patterns of two exemplary elementary school media centers. Ph.D. diss., Univ. of North Carolina.

Forcier, R. C. 1999. The computer as an educational tool: Productivity and problem solving. Columbus, Ohio: Merrill.

Giorgis, C. G., and B. Peterson. 1996. Teachers and librarians collaborate to create a community of learners. Language Arts 73, no. 8: 477–82.

Hacking, I. 1996. The looping effects of human kinds. In Causal cognition: A multidisciplinary approach, ed. C. Sperber, D. Premack, and J. Premack, 351–83. Oxford: Oxford Univ. Pr.

Hamilton, J. 1992. Virtual reality: How a computer-generated world could change the world. BusinessWeek (October 5): 96–105.

Hauck, P., and E. Scheiman. 1985. The role of the teacher-librarian in Alberta schools. ERIC Document ED 262 788.

Jackson, M. 1993. Library information skills and standardized achievement tests. In Assessment and the school library media center, ed. C. C. Kuhlthau, 25–32. Englewood, Colo.: Libraries Unlimited.

Jones, A. C. 1997. An analysis of the theoretical and actual curriculum development involvement of Georgia school library media specialists. Doctoral dissertation, Georgia State Univ. Dissertation Abstracts International 58(08A), 2890.

Kinder, S. J. 1995. Teacher-librarians’ perceptions and priorities in regard to elementary school library programs and services. Masters thesis, University of Regina, Canada.

Kuhlthau, C. C. 1994. Assessment and the school library media center. Englewood, Colo.: Libraries Unlimited.

Kuutti, K. 1996. Activity theory as a potential framework for human-computer interaction research. In Context and consciousness: Activity theory and human-computer interaction, ed. B. A. Nardi. Cambridge, Mass.: MIT Pr.

Lai, Y. 1995. The attitudes of public elementary school teachers and school library media specialists in three east Tennessee counties toward the instructional consultant role of the school library media specialist. Doctoral dissertation, Univ. of Tennessee. Dissertation Abstracts International 56(08A), 2986.

Lewis, C. G. 1990. The school library media program and its role in the middle school: A study of the perceptions of North Carolina middle school principals and media coordinators. Ed.D. diss., Univ. of North Carolina at Chapel Hill.

Madaus, G., and A. Tan. 1993. The growth of assessment. In Assessment and the school library media center, ed. C. C. Kuhlthau, 1–19. Englewood, Colo.: Libraries Unlimited.

McCarthy, C. A. 1997. A reality check: The challenges of implementing information power in school library media programs. Research and professional paper presented at the annual conference of International Association of School Librarianship, Vancouver, B.C., July 6–11. ERIC Document ED 412 958.

McCracken, A. 2001. School library media specialists’ perceptions of practice and importance of roles described in Information Power. School Library Media Research 4. Accessed Jan. 5, 2004,  www.ala.org/ala/aasl/aaslpubsandjournals/slmrb/slmrcontents/volume42001/mccracken.htm.

McLellan, H. 1992. Virtual realities. Accessed Nov. 5, 2002,  www.Aect.org/Intranet/Publications/edtech/15/index.html.

National Council on Education Standards and Testing (NCEST). 1992. Raising Standards for American education: A report to congress, the secretary of education, the national education goals panel, and the American people. Washington, D.C.: NCEST.

Naylor, A. P., and K. Jenkins. 1988. An investigation of principal’s perceptions of library media specialists performance evaluation terminology. School Library Media Quarterly 16, no. 2: 234–43.

Neuman, D. 1993. Alternative assessment: Promises and pitfalls. In Assessment and the school library media center, ed. C. C. Kuhlthau, 67–74. Englewood, Colo.: Libraries Unlimited.

Paulson, L. F., P. R. Paulson, and C. Meyer. 1991. What makes a portfolio a portfolio? Educational Leadership 49, no. 5: 60–63.

Person, D. G. 1993. A comparative study of role perceptions of school library media specialists and information power guidelines. Ph.D. diss., New York Univ.

Pickard, P. W. 1993. The instructional consultant role of the school library media specialist. School Library Media Quarterly 21, no. 2: 115–21.

Rothman, R. 1996. Taking aim at testing. Educational Psychology 98, no. 13: 205–208.

Sperschneider, W., and K. Bagger. 2000. Ethnographic fieldwork under industrial constraints: Towards design-in-context. NordiCHI2000 Proceedings, Oct. 23–25: 1–7. Accessed Jan. 20, 2004, www.nwow.alexandra.dk/publikationer/NordiCHI2000.pdf.

Sketch Information Literacy Committee. 1998. Library media and technology standards. Wisconsin: Sketch School District Printing Services.

Stoddard, C. G. 1991. School library media professionals in instructional development activities: Perceived time expectations and the identification of variables that enhance or limit instructional development practices. Ph.D. diss., Utah State Univ.

Stripling, B. 1993. Assessment of student performance. In Helping teachers teach: A school library media specialist’s role, ed. P. M. Turner, 140–57. Englewood, Colo.: Libraries Unlimited.

Turner, P. M., ed. 1993. Helping teachers teach: A school library media specialist’s role. Englewood, Colo.: Libraries Unlimited.

Van Deusen, J., and J. I. Tallman. 1994. The impact of scheduling on curriculum consultation and information skills instruction. School Library Media Quarterly 23, no. 1: 17–37.

Wilson, M., and R. Adams. 1996. Evaluating progress with alternative assessments: A model for title 1. In Implementing performance assessment: Promises, problems, and challenges, ed. M. B. Kane, 39–61. Mahwah, N.J.: Lawrence Erlbaum Associates.

Zessoules, R., and H. Gardner. 1991. Authentic assessment: Beyond the buzzword and into the classroom. In Expanding student assessment, ed. V. Perrone, 47–71. Alexandria, Va.: Association for Supervision and Curriculum Development.


Referee Record
Manuscript Submitted: September 2004
Board Approved: December 2004