A Case Study of One District's Implementation of Information Power

Kathy Latrobe, Professor, School of Library and Information Studies, University of Oklahoma, and Anne Masters, Director, Media Services and Instructional Technology, Norman Public Schools

One district sought to institutionalize the implementation of Information Power: Building Partnerships for Learning within its established planning and evaluation activities. The rationale for this strategy was that stakeholders could be informed about the principles of library media programming that support the standards while also being enabled to evaluate their programs and apply their findings to annual action plans. The strategy itself modeled the principles of Information Power in that it was a collaborative endeavor, overseen by district and building-level leaders who utilized technology. As reported in this case study, teachers, library media specialists, and principals evaluated their building-level library media programs according to the principles of teaching and learning, and they also evaluated the involvement of stakeholders in implementing those principles. By responding to the survey instrument "Assessing the School Library Media Program and Its Partnerships" (see appendix), these stakeholders provided data to inform future district implementation strategies and decision making at the building level. Descriptive statistics, including correlation coefficients for the relationship of program involvement to program progress, can thus inform discussions for developing building-level action plans that include library media programming. Teachers' responses indicated consistently positive correlations between the school community's involvement in the library media program and positive assessments of the program. Furthermore, the district's secondary library media programs had higher average ratings for teaching and learning activities than did the elementary programs.


This case study documents the initial implementation of Information Power: Building Partnerships for Learning (American Association of School Librarians 1998) in a school district in the southwestern United States. This district was selected for study not because it is typical but because it is exemplary and represents best practice. It is a past winner of the American Association of School Librarians' Encyclopædia Britannica Award for excellence in library media programming, and it is one of four districts in the United States to win the 2000 Model Professional Development Award from the United States Department of Education. In February 2001 the Mid-continent Regional Educational Laboratory (McREL), one of ten regional educational laboratories sponsored by the Office of Educational Research and Improvement of the United States Department of Education, sent a research team to the district to document its professional development strategies.

Following the publication of Information Power: Building Partnerships for Learning in 1998,[1] this district began a three-semester introduction of the new guidelines to library media specialists, administrators, and classroom teachers. The goal of the district's long-term implementation strategy was that the 1998 guidelines would be institutionalized as an ongoing force for program change, not merely the subject of a series of workshops. A primary tool in this implementation process was "Assessing the School Library Media Program and Its Partnerships" (see appendix), a survey instrument used as a communication device, as a basis for reflection on practice, and as a source of data identifying program strengths and weaknesses. The district's approach was built upon and guided by its well-established practice of collaboratively integrating planning and evaluation.

The inseparability of planning and evaluation[2] is consistent with Information Power: "Program assessment is integral to the planning process" (108). Furthermore, Information Power emphasizes the use of a team approach to evaluation: "Assessment is collaborative and based on sound principles related to learning and teaching, information literacy, and program administration" (108). The guidelines also state that

In close collaboration with teachers, students, administrators, and other members of the learning community, the library media specialist develops and implements an assessment cycle that guarantees continuing attention to the critical role of the program and its services within the school. The library media specialist's assessment plan follows specific, formal steps to focus attention on a variety of issues: student learning, the place of information literacy for student learning within the curriculum, the role of information technology in the school, the quality of facilities and resources, and the quality and relevance of policies and procedures. (108–9)

Therefore, consistent with the guidelines, the district designed the implementation process to encourage collaborative, ongoing assessment and planning with a focus on Information Power's basic principles of effective library media programming.

The principles of planning and evaluation articulated in Information Power have been part of good practice for the last half century. In analyzing trends in school library media evaluation, Joy McGregor (1998) identified significant developments across the twentieth century. Noting the integration of planning and evaluation during the 1950s, she cited A Planning Guide for the High School Library Program (Henne, Ersted, and Lohrer 1951), which described the use of evaluation in developing improvement plans. During the 1950s school librarians also utilized collaborative evaluation, a trend that continued into the 1960s, especially as modeled by the library programs recognized and supported by the Knapp Foundation (McGregor 1998). The Association for Educational Communications and Technology drafted (1976) and later published Evaluating Media Programs: District and School, A Method and an Instrument, which "included input from teachers, students, and media staff and provided for future planning throughout" (1980, 145). An important development of the 1980s was Retta Patrick's recommendation of a data collection technique for the purpose of developing action plans for improvement (1985). Significantly, Patrick concluded that a data collection technique could define expectations that, in turn, could be credited with positive change. In summing up the past decade, McGregor wrote that such trends as formative evaluation, emphasis on accountability and improvement plans, and collaboration among constituents in the evaluation process continued throughout the nineties (1998, 148).

Trends toward the increased integration of planning and evaluation activities for library media programs reflect the larger scope of contemporary educational reform. Writing about the reform of program evaluation in 1980, Lee Cronbach noted, "The process by which society learns is evaluation, whether personal and impressionist or systematic and comparatively objective" (12). The concept that the purpose of assessment is to educate guided the development of a series of evaluation handbooks published by the National Study of School Evaluation (NSSE).[3] In cooperation with the American Association of School Librarians (AASL), NSSE developed Program Evaluation: Library Media Services (2000), a guide for the evaluation of library media programs and of the student learning that occurs in those programs. As in this case study of the implementation of Information Power in a southwestern school district, Program Evaluation emphasizes the standards and principles of excellence in the national guidelines and the process of collaboration.

The school district has traditionally integrated planning and evaluation in the development of annual building-level improvement plans. School sites continuously collect and study profiling data, establish new site goals in the fall semester, and work throughout each school year implementing action plans created to address identified needs. Site visits are made at the end of the academic year by the superintendent and other central office administrators to assess the progress toward each site's goals. A survey instrument for library media programming provides additional profiling data and opportunities to address the ways that improved collaboration in the library media program can contribute to overall school improvement.

Following the publication of Information Power, the district's school library media specialists focused on planning and evaluation as primary modes for implementing the new guidelines, and they developed the following schedule:

  • In the fall of 1998 they analyzed the principles and implications of the new guidelines and began to document and share across the district the ways information literacy standards were integrated into curricula in all grade levels.

  • In the spring of 1999 building-level school library media specialists met jointly with the district's principals and central administrators to consider the implications of the 1998 guidelines. At an in-service workshop, building-level library media specialists presented their developing compilation of examples of information literacy standards integrated throughout the district. Also, in building-level teams, library media specialists and principals completed "Assessing the School Library Media Program and Its Partnerships," a questionnaire developed from Information Power (AASL 1998) (see appendix). The results of the collaborative assessment by library media specialists and principals are indicated in tables 1–4 under LMS/Principal (Team) Responses.

  • During the fall semester of 1999, information literacy standards and basic principles of Information Power were introduced to the district's teachers. As a follow-up to that introduction, the district's teachers completed the first section of "Assessing the School Library Media Program and Its Partnerships," considering and analyzing the teaching/learning activities of their own building-level school library media programs according to the principles of the 1998 guidelines.

Thus in the course of three consecutive semesters, three sets of stakeholders (school library media specialists, administrators, and teachers) had considered aspects of their own school library media programs in terms of the 1998 guidelines and also in terms of the level of participation by those sets of stakeholders.


The Evaluation Questions

The district's fall 1999 implementation efforts focused on the audience of teachers and on Information Power's principles for teaching and learning (AASL 1998, 58). Focusing teachers' attention on the principles of teaching and learning was a priority because student learning is central to the mission of the district's library media programs and to the spirit of Information Power. Although teachers are certainly involved in information access and program administration, they are most directly related to the teaching and learning components of the program. Also underlying this focus was the understanding that teachers needed first to be introduced to the new guidelines in ways most relevant to their needs and goals.

Integral to this implementation process were questions that could inform district and building-level planning and evaluation of school library media programs.

  • How did teachers rate library media program progress according to the guidelines' principles for teaching and learning?
  • How did teachers rate the participation of the library media specialist, the principal, and the other teachers in their buildings?
  • Were teachers' ratings of library media programs consistent with those of their building-level library media specialist/principal teams?
  • Were teachers' perceptions of stakeholder participation consistent with those of the library media specialist/principal teams?
  • Were there differences in program progress or program participation between elementary and secondary schools?
  • What was the relationship between program progress and program participation?

Although answers to these questions cannot be generalized beyond the district, the assessment methods themselves may be adapted and adopted, in light of the results of this evaluation study, for planning and evaluation in other library media programs. Furthermore, the data analysis may inform future research studies, especially studies of the relationship of program participation to program satisfaction, of the differences in needs and perceptions among elementary and secondary school library media specialists, and of issues in library media program evaluation.


Procedures

The district's teachers were introduced to the principles of Information Power (1998) and to the evaluation instrument in a series of meetings held at various school sites. Participating were teachers from fifteen elementary schools (grades K–5), four middle schools (grades 6–8), and two high schools (grades 9–12). Differences between the scheduled school day in elementary and secondary schools required different meeting times and sites. Meetings for elementary teachers were organized in sets of three schools and held in the most central of the three schools at the end of a school day. Secondary teachers met in their own schools one morning before their classes began. Teachers were given an overview of the guidelines' principles, an explanation of the information literacy standards, and an introduction to the assessment instrument. Teachers completed the assessment instrument anonymously.

In these meetings with teachers, as with the introduction of the guidelines to library media specialists (fall 1998) and to administrators (spring 1999), the district coordinator emphasized advocacy for school library media programs, and the theme of the in-service meetings was the celebration of progress. In addition to a summary of the information literacy standards, teachers received gifts of a mechanical pencil (with the inscription "Student achievement is the bottom line") and a small notepad with an information literacy standard printed at the bottom of each page. Teachers were motivated to return the completed assessment instrument by participation in building-level drawings for books and other materials provided by the district and a local bookstore. Of the district's 781 teachers, 523 (67%) completed the assessment instrument.


Assessment Instrument

"Assessing the School Library Media Program and Its Partnerships" (see appendix) was designed to fulfill a range of information needs. For each audience it was used first to create awareness of the principles that,

were identified and developed by the Information Power Vision Committee, reviewed and commented upon by the profession, and approved by the AASL and AECT boards as the cardinal premises on which learning and teaching [as well as program administration (AASL 1998, 101)] within the effective school library media program is based (58, 83).

Its structure emphasized not only the guiding principles for effective programming but also the concepts of partnership and collaboration that are basic to Information Power. Beyond developing awareness, respondents to the assessment had an opportunity to evaluate the implementation of the guiding principles for teaching and learning within their building-level library media programs and to assess the involvement of teachers, principals, and library media specialists. Building-level library media specialists gathered data that could later be shared as input for the development of collaborative action plans for their library media programs as well as their overall school programs. The assessment instrument also made it possible for the district library media coordinator to monitor the long-term implementation of the new guidelines within the district. Thus the assessment instrument facilitated communication and data collection among various audiences, and it was central to embedding implementation efforts in the district's annual action plans.

The development of the assessment instrument and its application to planning and evaluation evolved from "Assessment of the Building Level Library Media Program," developed in 1988 by Kathy Latrobe, Mildred Laughlin, Robert Swisher, and Anne Masters (Latrobe 1992, 43–45). The basis of the 1988 instrument was Information Power: Guidelines for School Library Media Programs (AASL and AECT 1988). The 1988 instrument listed the basic guidelines but did not integrate aspects of program partnerships. The district's tradition of investing in planning and evaluation activities is illustrated by the fact that school library media specialists and principals applied the first instrument to the planning and evaluation of library media programs during AASL's national implementation teleconference in the fall of 1988. Using data from that assessment, the district's school library media specialists developed action plans that guided their building-level programs into the 1990s.


Data Reported to Schools by the District Coordinator

The teachers' responses for program progress and program participation across the principles of teaching and learning were analyzed using the descriptive statistical applications of SPSS 8.0 for Windows. Means were calculated for the entire set of the district's schools, for elementary and secondary schools, and for individual schools (tables 1–4). For demonstration purposes, tables 1–4 include means for one elementary school (Elementary School X) and one secondary school (Secondary School Y). These two schools were arbitrarily selected to show how data were presented to building-level school library media specialists and administrators.

Program progress was rated on a scale of 1–5, with 5 being the highest level of program approval. Program participation was rated on a four-point scale (1 for no awareness, 2 for awareness, 3 for collaboration, and 4 for leadership). The correlation coefficients of participation levels to program progress were calculated (Pearson's r) for the district as a whole (table 5). The calculation of correlation coefficients for entire sets of variables was justified because the results were used descriptively rather than inferentially.
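Although the district's analysis was carried out in SPSS 8.0 for Windows, the same descriptive statistics can be sketched in a few lines of Python. The sketch below is illustrative only: the column names, school labels, and sample ratings are hypothetical and do not reproduce the district's data. It shows how means for program progress (for the district, by school level, and by school) and a Pearson correlation between participation and progress might be computed.

    # A minimal sketch only: hypothetical column names and invented sample ratings,
    # and Python rather than the SPSS 8.0 procedure the district actually used.
    import pandas as pd

    # One row per teacher response for a single principle of teaching and learning.
    responses = pd.DataFrame({
        "school":        ["Elem X", "Elem X", "Sec Y", "Sec Y", "Sec Y"],
        "level":         ["elementary", "elementary", "secondary", "secondary", "secondary"],
        "progress":      [4, 5, 4, 5, 3],    # 1-5, with 5 the highest level of approval
        "participation": [2, 3, 3, 4, 2],    # 1 no awareness ... 4 leadership
    })

    # Means for the district as a whole, by school level, and by individual school
    # (the kind of figures reported in tables 1-4).
    district_mean = responses["progress"].mean()
    level_means = responses.groupby("level")["progress"].mean()
    school_means = responses.groupby("school")["progress"].mean()

    # Pearson's r for participation versus progress across the district (as in table 5).
    r = responses["participation"].corr(responses["progress"], method="pearson")

    print(f"District mean progress: {district_mean:.2f}")
    print(level_means)
    print(school_means)
    print(f"Pearson's r (participation vs. progress): {r:.2f}")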


Implications and Conclusions

The purpose of the data collection and analyses was not to rank programs or schools but rather to support collaborative planning led at the district level by the library media coordinator and at the building level by the school library media specialist. The district coordinator, acting as a consultant, specifically sought a collaborative, not a competitive, environment; thus schools were presented with summative data for the district and with specific school data only for their own sites. Of interest at the district level were data indicating the following:

  • Means for program progress across the ten principles ranged from 4.00 to 4.19, indicating teachers' approval of the programs within the areas of teaching and learning.
  • Means for program progress for secondary schools exceeded those for elementary schools on nine of the ten principles of teaching and learning.
  • Without exception, there was a positive correlation of program participation to program approval.
  • Correlation coefficients indicated that, among library media specialists, principals, and teachers, the library media specialist's participation was the best predictor of program progress.

Future district planning will be shaped by the indications that higher levels of participation among teachers, administrators, and library media specialists are associated with higher levels of program approval and that differences among schools underscore the importance of collaboratively developing unique site-based improvement plans. For continuing implementation efforts, the results will inform in-service providers, who may choose to showcase the strengths of particular schools. Significantly, the strength of the secondary schools across the principles of teaching and learning challenges the assumption, sometimes held, that the teaching and learning functions of the library media program (and the guidelines as a whole) are more relevant at the elementary level.

Data on Elementary School X and Secondary School Y (tables 1–4) illustrate how this implementation assessment could shape discussions, planning, and future data collection. For example, considerations in Elementary School X may include an analysis of the discrepancies among the perceptions of the stakeholders (those of teachers and those of the library media specialist and principal team) and the development of an improvement plan targeting particular principles (e.g., the program's integration into the curriculum or its link to the larger community).

Secondary School Y may choose to investigate how teachers' perceptions of program progress relate to their expectations and practice; focus a future evaluation on the guidelines' second set of principles (those of information access and delivery); and develop a workshop or conference presentation on ways the program promotes the understanding and enjoyment of literature.

Individual schools, as well as the district as a whole, might utilize other data collection techniques and instruments to gain perspective on specific program aspects, considering such useful sources as Nancy Everhart's Evaluating the School Library Media Center (1998). In Forecasting the Future, Wright and Davie emphasize the value "of multiyear planning and evaluation" and "argue strongly for lots of formative evaluation procedures and a very limited number of summative evaluations" (1999, 153). They conclude that multiyear planning based upon an array of evaluation procedures enables the "library media program . . . to be self-correcting as the currents of the instructional program change" (153).

Although the conclusions in this case study cannot be generalized beyond the district, they do suggest the appropriateness of future action and theoretical research studies on implementation activities and especially on the role of stakeholders' collaboration and participation, stakeholders' expectations, and the integration of planning and evaluation. Furthermore, there is a need for the study of data collection methods and instruments that can best inform the practice of planning and evaluation within library media programs.


Notes

  1. Information Power and related products are available for purchase from the American Association of School Librarians (AASL) on the AASL Information Power Products Web page. AASL also makes available free PowerPoint presentations to guide media specialists in sharing Information Power with parents, teachers, and administrators. These presentations can be found on the AASL Information Power Basic Implementation Kit Web page.

  2. A brief overview of the principles of program planning and evaluation can be found on the Northwest Regional Educational Laboratory Web site at www.nwrel.org/eval/evaluation/planning.shtml.

  3. You can learn more about the National Study of School Evaluation and its publications from its Web site at www.nsse.org.


Works Cited

American Association of School Librarians and the Association for Educational Communications and Technology. 1988. Information power: Guidelines for school library media programs. Chicago: ALA.

———. 1998. Information power: Building partnerships for learning. Chicago: ALA.

Cronbach, L. 1980. Toward reform of program evaluation. San Francisco: Jossey-Bass.

Everhart, N. 1998. Evaluating the school library media center. Englewood, Colo.: Libraries Unlimited.

Henne, F., R. Ersted, and A. Lohrer. 1951. A planning guide for the high school library program. Chicago: ALA.

Latrobe, K. 1992. Evaluating library media programs in terms of Information power. School Library Media Quarterly 21:37–45.

McGregor, J. 1998. Determining value: Library media programs and evaluation. In The emerging school library media center: Historical issues and perspectives, ed. by K. Latrobe. Englewood, Colo.: Libraries Unlimited.

National Study of School Evaluation (In collaboration with the Alliance for Curriculum Reform and representatives from the American Association of School Librarians). 2000. Program evaluation: Library media services. Schaumburg, Ill.: National Study of School Evaluation.

Patrick, R. 1985. Effect of certain reporting techniques on instructional involvement of library media specialists. Drexel Library Quarterly 21:52–68.

Wright, K., and J. Davie. 1999. Forecasting the future: School media programs in an age of change. Lanham, Md.: Scarecrow.


Appendix

The appendix, "Assessing the School Library Media Program and Its Partnerships," is available for download in PDF format.


Key AASL Resource

School library media specialists who want to explore effective methods of program evaluation should read the new publication A Planning Guide for Information Power: Building Partnerships for Learning with School Library Media Program Assessment Rubric for the 21st Century.


Referee Record
Manuscript submitted: August 2000
Revised: January 2001, March 2001
Board approved: April 2001