Selected articles from previous issues of Prism: the newsletter from the Office for Accreditation

best of prism

The Bigger Picture, by Richard Rubin
The Role of Mission, Goals, and Objectives for Program Reviewers, by Richard Rubin
Library and Information Studies Education: Diversity/Equity, by Jane B. Robbins
First-Timer's View, by Melody M. Hainsworth
COA Pioneers Non-Visit Accreditation Process, by Brooke E. Sheldon
LIS Accreditation in Canada, by Karen Adams
Coordinating Program Reviews, by Jane B. Robbins
Planning to Plan, by Prudence W. Dalrymple
A Culture of Evidence, by Prudence W. Dalrymple


   from Prism, Fall 2008 (volume 15, number 2)

The Bigger Picture

Richard E. Rubin, Chair, ALA Committee on Accreditation, and Director, School of Library and Information Science, Kent State University

Just a short time ago, the Council for Higher Education Accreditation (CHEA), the body that recognizes accrediting agencies, issued an important report, U.S. Accreditation and the Future of Quality Assurance. The report, which is a general discussion of accreditation, is also a reminder that the accreditation of programs in library and information studies and the Committee on Accreditation operate within a much broader context: the accreditation of higher education institutions and programs. I thought it might be enlightening to identify briefly some of the trends that CHEA believes are affecting the broader aspects of accreditation in higher education.

Issue 1:  The extent to which a program meets societal needs, not just the needs of institutions. More attention is being paid to such questions as: Are programs functioning in the public interest as well as in the interest of the academic institution? Is the program attracting a diverse student body to serve the diverse needs of its population? Is there a substantive social return on investment? Are student graduation rates optimal?

Issue 2:  The decline in public credibility. Higher education has in general been considered a public good. Yet citizens no longer exempt these institutions from examination and criticism. The public is no longer willing to assume that accreditation is a guarantee of quality. Hence, policy makers may be more inclined to regulate higher education rather than to rely on voluntary bodies such as accrediting agencies. Our institutions and programs must be more transparent to those who provide the funding.

Issue 3:  Fiscal pressures. While the cost of higher education continues to rise, funding does not keep pace. Professional programs such as LIS programs are likely to require increasing resources, and hence will become increasingly expensive for our students. Accreditation processes must continue to focus, at least in part, on the fiscal ability to maintain quality while also maintaining accessibility. In such an environment, “results” will be emphasized.

Issue 4:  Changing patterns of student attendance. Students are increasingly attending more than one institution before completing their degrees. Accrediting agencies need to examine issues such as transfer credits and program flexibility.

Issue 5:  Changing modes of instruction. Institutions of higher education are increasingly experimenting with new methods of delivery for instruction. Online programs are growing, and it is critical that accrediting agencies ensure that quality and standards are maintained while these new methods are implemented.

Issue 6:  Demographics. The pool of available students will become increasingly diverse.  Programs will need to recognize not just major ethnic and racial groups, but subgroups.  Accrediting bodies will need to increase their attention to the ability of programs to attract, retain, and be responsive to a diverse clientele.

Issue 7: Globalization.  Programs in higher education are increasingly expanding into other nations and recruiting on an international basis. Accrediting bodies must ensure that high quality facilities and services at all locations are maintained. In addition, accrediting agencies need to assess how well the needs of international students attending programs in the U.S. are met.

This list is just a sample of the larger forces that shape accreditation.  As we consider the role of the COA and our Standards, it is also essential that we understand the forces that influence accreditation as a whole.

Council for Higher Education Accreditation. U.S. Accreditation and the Future of Quality Assurance, prepared by Peter T. Ewell. Washington, DC: CHEA, 2008.
Available for order at www.chea.org/store/index.asp.



   from Prism, Fall 2006 (volume 14, number 2)

The Role of Mission, Goals, and Objectives for Program Reviewers

Richard Rubin, Member, ALA Committee on Accreditation and Director, Kent State University, School of Library and Information Science

Understanding and assessing the mission, goals, and objectives of a particular program are critical activities in any program review. It is vital that External Review Panelists place in proper perspective the role of the ALA Standards for Accreditation of Master’s Programs in Library and Information Studies (1992) and the program’s stated mission, goals, and objectives (MGOs) when examining the program presentation. The Standards and MGOs have two distinct but closely related functions.

First, the standards are the lens through which one views the program. All aspects of the program presentation are viewed through this lens. Consequently, while reading sections of the program presentation, the panelist should be able to identify specifically which standards are being addressed. As issues arise while reading the program presentation, the panelist should note the specific standard under which each issue falls. It is very helpful to the Committee on Accreditation when the ERP report notes issues to be addressed by the program and identifies which particular standard or standards are implicated.

Second, the mission, goals, and objectives identified in the program presentation establish the perspective or point of view from which the ERP views the program. Many types of programs can comport with the COA standards. The MGOs help the program under review, the program reviewers, and the Committee understand what the program is about: what it emphasizes and how it approaches the field. The MGOs help panelists and the Committee concentrate on the focal and unique aspects of the particular program. The more that is known about the perspective of the program, the greater the ability of the ERP and Committee to apply the standards in an appropriate and effective manner.

Panelists should keep in mind that there are tremendous variations in programs and their emphases. It is not the role of reviewers to impose their own perspective on the roles of programs; that is the responsibility of the program itself. However, the program presentation should be clear about the program’s MGOs. The reviewer should be able to determine from the program presentation what the program’s MGOs are and how the program accomplishes or meets them, including the educational outcomes achieved or expected. The reviewer should also be able to determine how the program’s MGOs are integrated into the MGOs of the academic institution as a whole. This important role of the MGOs in program assessment is why the COA places such importance on systematic and ongoing planning processes.

One way for both the program and the program reviewer to evaluate a program’s planning activities is to address four questions:

   1. What types of planning does the program do? (Broad-based, curricular, programmatic)
   2. How is planning accomplished? (Retreats, meetings, informal, surveys, systematic data collection and analysis)
   3. How often is planning conducted?
   4. Who is involved in the planning processes? (Internal and external constituencies)

Finally, the reviewer must examine how the program assesses the MGOs. Among the questions to be addressed are the following:

   1. How does the program know it is doing the job?
   2. What outcomes does the program expect?
   3. What techniques does the program use and what data does it collect to support program assessment?

Program reviewers have a solid foundation on which to base their assessments: the ALA Standards. This foundation should be bolstered by clearly stated MGOs in the program presentation. By viewing the program through the lens of the Standards and the perspective of the MGOs, program reviewers have the critical tools to evaluate subsequent sections of the program presentation and the program in its entirety.



   from Prism, Summer 2000 (volume 8, issue 2)

Library and Information Studies Education: Diversity/Equity

Jane B. Robbins, Dean and Professor at Florida State University School of Information Studies

Over the past two years, members of the Committee on Accreditation have held a number of discussions regarding the closely related but differentiated issues of diversity and equity. These discussions were most often engendered by portions of Program Presentations or External Review Panel reports that addressed issues of diversity in all standards areas, but most frequently in the Curriculum, Students, and Faculty standards. This column represents my personal perspectives on diversity and equity; however, it has been discussed with members of the Committee on Accreditation and the Office for Accreditation.

This column is offered in the hope that it may be useful to all of us engaged in our accreditation enterprise.

In nations based on legal and ethical principles of individual equality and democratic participation, in this case the United States and Canada, diversity serves as a fundamental measure of representativeness. Diversity can be understood as the extent to which organizations at all levels of our society reflect and accommodate the demographic characteristics of the general population. Diversity as detailed in the Standards for Accreditation 1992 (p.5) includes “…age, ancestry, color, creed, disability, gender, individual lifestyle, marital status, national origin, race, religion, sexual orientation, or veteran status.” At its best, diversity promotes pluralism, understanding, and inclusion in the creation of dynamic and cohesive communities. In education and employment, measures of diversity can reflect concrete advances and achievements in implementing the democratic ideals of fairness and equality; yet these measures also serve to indicate systematic structural biases that deprive particular groups of power, opportunity, and economic independence.

The American Library Association promotes diversity as a core professional value and advocates libraries as great democratic institutions, serving all people regardless of age, income level, location, or ethnicity, and providing the full range of information resources needed to live, learn, govern, and work. Indeed, U.S. Census Bureau population estimates predict that the racial and ethnic balance of the population will continue to change over the next few decades, resulting in the slow evolution of a nation without one clear majority group and thus increasing the imperative to assure that all groups have an equal opportunity to participate in society. However, if the library and information professions intend to credibly champion universal equality of access and opportunity for their public, then they must demonstrate their allegiance to these principles within their own profession and its organizational structures and hierarchies.

Equity means understanding and working affirmatively to amend historical and contemporary underrepresentation in key societal arenas, notably education and employment. Racism, sexism, and other forms of prejudicial exclusion serve to prevent some people from reaching parity in our diverse societies; therefore, the existence of equity programs and evidence of actions taken in pursuit of their goals can be used to measure commitment to diversity. Such programs and actions should be essential and valued components of accredited library and information science programs. There is little evidence in Program Presentations and External Review Panel (ERP) reports that schools have developed equity programs and action plans to achieve diversity.

Education is the key effort to actualize diversity and equity, not only because a master’s degree is a prerequisite to professional entry into the field, but also because educational attainment contributes to the creation of leaders and has a significant impact on earnings levels throughout workers’ lives. The U.S. Department of Labor reports improvement in the wage gap between men and women as well as certain ethnic and racial categories; still, pay inequities persist for women, minorities, and the disabled. The composite wage difference between these groups and their white male counterparts is frequently explained by significant differences in skills and educational levels (U.S. Department of Labor 1999). Further, the rapid pace of technological change may increase the wage gap between well-educated workers and those who are not. It is this situation that library and information studies recruitment and retention efforts can address directly, by equipping a diverse student cohort that accurately reflects the full diversity of American citizenship with the skills necessary to compete in an information economy.

Many key higher education organizations, such as the Association of American Universities and the American Council on Education, have issued useful statements and action plans regarding diversity and equity; several of these are readily available via the Internet. It is imperative that those responsible for our accredited programs be familiar with these documents and use them or other similar materials in the preparation of their diversity and equity activities. Over 15 years ago, James Boyer (ERIC Document 240 224) detailed a five-step process leading to truly multicultural education: awareness, analysis, acceptance, adoption, and actualization/advocacy.

My analysis of the development of library and information studies programs as expressed in Program Presentations and ERP reports is that we remain at the awareness and analysis end of this continuum with few of us moving through the acceptance, adoption and actualization stages. This is a seriously flawed condition and requires committed attention from our programs and their stakeholders.



   from Prism, Winter 2000 (volume 8, issue 1)

First-Timer’s View

Melody M. Hainsworth, PhD, International College, Naples, Florida

There is a first for everything, and everyone has one. In this instance, it was my first COA accreditation site visit. I had been an external review panelist for several years, so I was really excited to get my first invitation to go onsite. I found there are a number of pieces involved in making a site visit successful, both from the perspective of the COA team and that of the host library school.

The team needs you to be timely, to carry your weight, to turn up where you should, when you should, and, most importantly, to be prepared. This preparation includes the reading that must be done, the preliminary report that must be written prior to the visit, and the close-to-final draft before you leave the site. I noted that it takes only one team member who isn’t ready to affect all the others. We are all busy people, and no one wants to carry someone else; if you do not have the time to do the work, you should not have volunteered.

Luckily, my first team had a seasoned Chair, and I was the only first-timer. The other three members were ready to proceed at the first dinner meeting, were never late for meetings, and collected data efficiently. They were a joy to work with. For my part, I made the decision to have my next-to-final draft completed before I left the site. Accordingly, I had my laptop with me so I could work as much as I needed in my hotel room. I also lugged along every piece of printed material the school had submitted, plus the information from COA. I did wish I had brought my portable printer.

Where do you start and how do you proceed, when it is your first time? Well, I made sure I responded quickly to the e-mails from our chair and COA staff with the information they needed. I set up my e-mail inbox folders and e-mail alert capabilities so no communication was missed. As soon as the chair said we should book our airfare, I phoned the travel agent. The library school and your chair need to know when the members are arriving for scheduling purposes. When the school e-mailed with the hotel information, I called the hotel directly to check on how to get from the airport to the hotel. I did all I could in preparing to ensure I would not be late for the organizational meeting.

When I began to get documents from COA, I scoured them immediately, reading everything as a whole. Documents included the accreditation guidelines, information on the recent changes, and the role of team members. From those I understood that, although I was going to be responsible for writing one section of the report, I should be ready to observe and comment on all areas of the standards. That understanding affected how I read the library school documents. I read their report as a whole, taking no notes. When I received my area of responsibility from the Chair, I read the school’s report as a whole again, looking for comments in the area I was responsible for writing. I then created a rough chart for evidentiary information in my areas and went back to the report looking only for comments about those. It was from this chart and my general notes that I began to get some ideas of what questions I might ask onsite and what documents I might need to see that were not in the report.

Onsite at the library school, it can be very stressful for everyone. You have to remember that you are there to elicit information about their school, and to set aside any “this is how we do it at my school” impulses. Conversations between team members are at times confidential, so space to meet and talk away from the faculty is key. As the team met on campus in our assigned room, we reviewed each area of responsibility and what information we would be seeking. Before we arrived, we decided whom we wished to interview, and that schedule was reviewed onsite. Dress professionally; no casual clothing except for evenings.

As luck would have it, I was assigned my first regional accreditation visit for a university with the Southern Association of Colleges and Schools (SACS) Commission on Colleges, which immediately followed the COA visit. It was a unique opportunity to really focus on the accreditation process, both professional and institutional. SACS/COC has no external review panel members. They always have a COC staff person attend with the onsite team, and often the state executive director of their licensing board joins the team. COC teams are three times as large, and the institution must provide a meeting room and computer equipment at the hotel where the team stays, in addition to the facilities onsite. Oh, and team members are paid $50.00.

Why go to the effort to do a site visit? It forces you to be familiar with your profession’s education credo and standards. You gain great ideas for your own institution. And you get to meet with other faculty in their home setting, outside of the conference milieu.



   from Prism, Spring 1999 (volume 7, issue 2)

COA Pioneers Non-Visit Accreditation Process

Brooke E. Sheldon

In the fall of 1997, the Committee on Accreditation conducted its first continuing accreditation process in which members of the evaluation team did not visit the campus. We believe that this was a pioneering experiment among professional associations and higher education accrediting agencies.

The non-visit evaluation was conducted at the Graduate School of Library and Information Science at the University of Illinois. The decision to experiment with a non-visit was made by COA in response to a proposal submitted by Illinois, which stated in part, “We recognize the panel’s need to supplement review of written evidence with interviews and observation. We have ideas on how to employ electronic mail, asynchronous and synchronous electronic forums, telephone and conference calls and video conferencing to allow panel members to communicate with the range of individuals who normally would be met onsite. We feel that these tools may in fact offer the panel opportunities to communicate with a wider range of individuals and to develop a more complete picture of our program than was possible when meeting selected individuals who happened to be onsite during the three to four days over which site visits have been scheduled in the past. While panel members would not be able to observe classroom based courses, they could participate as guests in synchronous electronic class discussions . . . they could also interview by phone or electronic mail selected students enrolled in various courses as well as all the faculty involved in our teaching program.”

The review was conducted by a team made up of Jane Robbins, dean, SIS, Florida State University, panel chair; Daniel Barron, professor, CLIS, University of South Carolina; Martin Dillon, executive director, OCLC Institute; Joan C. Durrance, professor, SI, University of Michigan; Carol C. Kuhlthau, associate professor, CILS, Rutgers; and Nancy K. Roderer, director, Cushing/Whitney Medical Library, Yale University.

In the spring of 1998, a subcommittee of COA conducted an evaluation of the process. Members of the faculty, the External Review Panel, the Dean and Associate Dean were contacted individually. Students were also contacted via a general email questionnaire. All were asked to respond to a series of questions related to their perception of the experience.

For the most part (with two exceptions), faculty felt very much involved in the accreditation review process. As one noted, “I really would not want to be more involved in the review process—very time consuming.” Most faculty members felt that the experiment was worthwhile and that “ALA should try it again as an alternative mode of review appropriate for well established schools.” As one faculty member expressed it, “Thank you for the opportunity to participate in this soon to be widely accepted alternate visit.” A bonus for one instructor, who taught the administration course, was having the materials (documentation) available online: “It made the process more open and meant I could use the materials in my teaching.”

According to Dean Leigh Estabrook and Associate Dean Linda Smith, “a key factor was the availability of a server that allowed the school to set up a separate team email group, and the ability to put our presentation in web form. We were aided by university resources that could be linked. Since most of our own internal school information already existed in electronic form, it was easily linked to the program presentation.”

They also noted the savings in travel and housing as “a valuable option for smaller schools.” Because interviews did not depend on people being on campus, it was viewed as an efficient use of faculty and staff time.

The Dean noted “it is important to schools intending to request this option to anticipate the kind of evidence that a team will need. This may seem obvious but any school has to supplement the information in the program presentation and it must be done quickly . . . we are a very online school, so the format and process was comfortable for us. It would be a punitive process for schools without extensive technical and support infrastructures.”

The process had mixed reviews from the External Review Panel. While some found the process effective and were very satisfied with the amount and timeliness of materials provided, and also felt actively involved, a frequent complaint was that the knowledge gained was secondary, not primary, and the panel was not able to interact effectively. “I learned something about online collaborative work in this experiment . . . namely, it is not an efficient or satisfying way to work.” One panel member said, “it does not save as much time as one would suppose. Consider my task of interviewing students . . . it would have been a breeze onsite assuming school was in session. In distance mode it took two or three weeks.”

We did not receive any student feedback from Illinois. It perhaps would have been better to contact a sample of students individually.

Finally, we return to the vital question to which we asked everyone to respond: Should COA try it again? The responses may be best summed up by one of the External Panel members, who said, “COA should decide after the team looks at the initial program document,” and added, “I love the exchange in a more traditional visit (even at dinner a lot of information is exchanged) yet I think as we become more and more users of telecommunications, we need not move people as much as ideas and information.”



   from Prism, Fall 1997 (volume 5, issue 3)

LIS Accreditation in Canada

Karen Adams

The following excerpts are taken from a presentation by Karen Adams, executive director of the Canadian Library Association, to the Committee on Accreditation at the 1997 Spring Meeting. They are captured here for all of us, especially external review panels, to provide valuable insight into Canadian higher education and the context in which the Canadian programs are offered.

My presentation derives from conversations with the deans of the Canadian library schools. I will follow the broad headings in the standards themselves as an organizing principle.

1. Mission, goals and objectives

In general, Canadian schools are responsive to the planning imperative which underlies the 1992 standards. This is familiar work because of the financial strains placed on universities in the 1990s, and planning is likely to be pervasive in the institution being reviewed. At the same time, Canadian universities are placing higher value on excellence in research and teaching over the how-to, more practical focus that used to be tolerated in the professional schools. This creates a situation where, in spite of planning for survival within the parent institution, the school may not be well understood by the practitioners in the area. There is a push-pull between the practical needs of the practitioner and the university’s emphasis on excellence in research and teaching.

2. Curriculum

The major difference in Canadian curriculum offerings is likely to be the absence of much material related to school librarianship. Provincial jurisdiction over the K–12 system means that standards for teachers are different in each province. The fact that there are ten provinces and two northern territories in Canada, but only five provinces that are home to a library school, has contributed to faculties of education in each province taking over education for teacher-librarianship. What can be a bread-and-butter program for a school in the USA will probably be non-existent here.

3. Faculty

Regarding faculty, I am told that some External Review Panel members have been concerned about the standard which prefers that faculty hold advanced degrees from a variety of academic institutions. Many Canadian faculty members have taken their master’s in Canada at one of the seven schools, and then gone on to one of the three PhD-granting institutions among the seven. Obviously, the variety of institutions is limited unless one leaves the country. This difficulty is compounded by Canadian immigration laws that require the hiring of Canadian citizens first, with clearance required to hire from outside the country.

Publication and research also have special constraints on the Canadian scene. Until 1992, there were two peer-reviewed journals in the field of library and information science. In 1992, the Canadian Library Association ceased publication of the Canadian Library Journal, which was merged with the Journal of the Canadian Association of Information Science to form the Canadian Journal of Information and Library Science. Opportunities for journal publication in Canada are thus limited to this one refereed quarterly journal of about 90 pages an issue. Deciding to publish in a USA-based journal means that the article cannot be too Canadian or it will not be of interest, which poses a problem for those concerned with matters particularly Canadian. Similarly, there is only one publisher of library science monographs in Canada, the CLA; however, a best-seller for us is about 300 copies, and we cannot subsidize publishing from other Association activities. Again, this means that Canadian faculty members have limited opportunities for publishing. Given these constraints, when you see an impressive track record in Canada, it is truly impressive.

The major source of research funding for Canadian faculty is the Social Sciences and Humanities Research Council, a federal agency. It, too, is undergoing cutbacks, and even when research grants are available from SSHRC, they do not fund any indirect costs of research or of the institution's overhead costs. Although the data indicate that Canada and the USA have traditionally spent about the same proportion of their GDP on research, Canada's share has begun to decline, and this is affecting Canadian scholarship.

4. Students

The most consistent complaint about the accreditation standards as they currently exist is the perceived need for statistics about the multicultural, multiethnic, and multilingual student body. Canada’s constitution includes a Charter of Rights and Freedoms which is based on the concept of human rights and focuses on equality, language rights, and the protection of the multicultural heritage of Canadians. Since the Charter came into force in 1982, court decisions have dictated that institutions may not ask the individual to provide information about race or physical ability. Such data can be provided voluntarily to Statistics Canada, the national federal agency, but the agency is prohibited from making information about the individual public. This means that there can be no data about the multicultural breakdown of a class in Canada except the report of the naked eye observing visible minorities, a highly flawed process. It is important that the External Review Panel understand that statistics about the multicultural, multiethnic, and multilingual student body simply cannot be collected in Canada.

Canadian students appear less likely to be part time. One library school director indicated that 95% of his students were full time. This is largely a matter of geography: people move to the cities where the schools are located in order to attend full time.

5. Administration and Financial Support

Here I was asked to take note that, especially in the smaller Canadian schools, the overall cutbacks in funding have led to administrative mergers in which one unit outside the school acts as administrative support to both the school and several other units. It is hoped that the focus can be maintained on whether or not there is adequate financial support to achieve the program’s goals, rather than placing undue emphasis on where the record keeping takes place. The External Review Panel must be encouraged to remember always that the purpose is to accredit the program, not the administrative entity. It is also important to note that Canadian schools operate on resources tied to a quota of students: if they took in more students, they would not receive any increase in resources or faculty.

I want to thank the Committee on Accreditation and the staff of the Office for Accreditation for the opportunity to raise these matters that are unique to Canada. Overall, the accreditation process works well, and the COA has in recent years certainly been sensitive to the need for Canadian participation on the external review panels.



   from Prism, Spring 1996 (volume 4, issue 2)

Coordinating Program Reviews

Jane B. Robbins, Dean, School of Library and Information Studies, Florida State University

Among the most prevalent laments of LIS faculty, staff, and administrators is that there is a seemingly endless onslaught of assessment groups, each wanting to bring its expertise to bear on our efforts; they want to aid us in assuring our various stakeholders that we do indeed know what we are doing and what we hope to be accomplishing. For too many of us, it is not unusual to have periods of three or more years in which each year contains a due date for an assessor’s review document. In addition to the specialized accreditors, COA and frequently NCATE, our university may conduct a review of our unit or of one or more of our degree programs; or the state may review all programs in our field throughout the state. While each review has unique requirements, their similarities predominate. Although some of us might prefer to deal with each reviewing group separately, I think it is safe to assert that most would wish to coordinate one or more of these reviews.

Over the past several months I have had the opportunity to chair a coordinated South Carolina Higher Education Commission and COA program review for the University of South Carolina’s College of Library and Information Science. It may be that the highly positive experience all the participants agreed was achieved is attributable to some characteristics of the situation, e.g., only one university in the state offers LIS education; the college’s programs (Master’s, Certificate of Graduate Study, and Specialist) are closely interrelated in terms of faculty and curricular offerings; and all the parties were eager to achieve positive results. However, I do not believe that this was a unique experience, and I do believe that others could achieve similar results.

The coordinated review entailed three key agreements: (1) The Higher Education Commission agreed to accept the COA Program Presentation document as the State’s document and to request only demonstrably necessary additional information; for this review, data covering a longer period of time regarding enrollments and degrees awarded for all three programs were required. (2) The COA agreed to include the state’s designee as a participant observer on the external review panel. (3) The COA was willing to allow the chair of the panel to serve as the state’s consultant. (It might be that a member of the panel other than the chair could serve in the consultant capacity.) During the on-site review, there were no more than three occasions when the state’s designee (the Commission’s Coordinator for Academic Programs) asked the chair/consultant questions specific to only the state’s needs; each was asked in private at convenient times. The chair/consultant and the state’s designee also conferred by phone two or three times prior to the site visit, then for about two hours before the first meeting with the full on-site COA panel, and for about an hour after the site review concluded.

The state’s designee was fully informed of all the activities of the panel and received copies of the drafts and final version of the panel's report as well as of the accreditation decision. In addition, the chair/consultant prepared a separate report for the state after the COA work was completed. Approximately half of that report was made up of material from the panel’s report and the remaining half was information required by the state. The consultant’s report also included material of a rather instructive nature about the field of library and information studies for the audience of lay members who hear the state’s assessment presentation. It is important to note that this additional work by the consultant did not require further involvement by the college.

I do not know how many LIS schools have been able to coordinate internal university reviews with their COA reviews, but this was possible when I was at the University of Wisconsin–Madison. We had the Graduate School review our doctoral program in the same year we did our COA review; the faculty members of the Graduate School review committee found our COA document (this was in about 1985, so it was under the previous COA process) to be most informative. They stated that it greatly aided them in understanding the context of the doctoral program and thus in preparing their report to the Graduate School.

I commend the idea of cooperative reviews to any of you who wish to give one a try; certainly my experience, as well as the reports of those with whom I have been involved in the cooperative review process, suggests that a great deal of unnecessary effort is avoided and there are virtually no identifiable drawbacks. I do hope that I can coordinate a three-way review of FSU’s SLIS in the near future, in which the state, the university, and the COA cooperate in a meta-review. It would be interesting while it was in process and delightful (for a number of years) when it was over. A simple phone call got the coordinated reviews reported on here started . . . so, pick up the phone!



   from Prism, Fall 1995 (volume 3, issue 4)

Planning to Plan

Prudence W. Dalrymple

As I write this column, about one third of the accredited LIS programs have had to confront the question: “How do I plan to plan?” This query is usually prompted by the requirement that programs submit a Plan for the Program Presentation one year before the scheduled review. The Plan serves several purposes: one is to ensure that programs begin preparing for their review in a timely manner; another is to give the Office and the Committee a more specific idea of what characteristics the school is seeking in the panel chair and panel members. In my capacity as consultant to programs, I have found that creating the plan is an important, but sometimes puzzling, step. In this column, I share some suggestions that have arisen from working with the programs that have undertaken reaccreditation thus far. There is no required format for the plan, but based on my observations of the plans that have been submitted, I offer these suggestions in the hope that others may find them useful.

1. The Program Plan is a good time to deal with the general tone or theme of the program presentation (PP), i.e., will there be special areas of focus or emphasis in the PP? The PP seems to work best when the school takes a future-oriented approach, often building on any strategic planning documents, vision statements, or the like that may already be in existence or planned for the near future.

2. Consider the distribution of the labor needed to accomplish the Program Presentation. In the past, many faculties divided up into six groups, one for each standard. While this is certainly one way to approach the process, it can often work against success because it makes the Standards the focus of the process, rather than the program and its future. Where possible, it is advisable to work as a committee of the whole and then, once a general direction and objectives for the program have been identified, to break into committees or task forces that can have a useful life beyond the accreditation cycle.

3. As you develop the plan, indicate whether any other external reviews, such as NCATE, a university review, or a Southern Association review, are scheduled at or around the time of the COA review. COA encourages schools to coordinate these reviews so as to maximize the university resources dedicated to external evaluation.

4. Determine and describe plans for any data collection efforts that will be necessary for the review. For example, do you plan to conduct focus groups, structured interviews, mail or telephone surveys with your constituent groups and/or students and alumni? The ERP chair and I are happy to assist you in planning these activities, and it is important to lay them out with a timeline for design of the instrument (if not already in existence), data collection, analysis, and reporting.

5. Begin thinking about the format of the PP. Generally, focusing on the mission, goals, and objectives of the school is of paramount importance. Setting measurable objectives with benchmarks or targets to measure accomplishment indicates that a planning and evaluation component is resident within the school. There is a strong emphasis on planning and evaluation in the 1992 Standards, and it is important and beneficial to approach the PP as a planning and evaluation activity that is developmental for the school.

6. Once you have determined how you want to approach the PP (it is both a process and a document), you can begin to determine how the work will get done. Again, when the PP is approached as a process, taking the time now to establish the planning and evaluation mechanisms in your school (if they don’t already exist) is an investment that will continue to benefit the school long after the actual review takes place.

7. Finally, you can map your plan against the Standards to determine the degree to which your program conforms to each standard. As more and more programs gain experience with the revised accreditation process, others can take advantage of prior experience through collegial consultation and sharing. At the ALISE conference in San Antonio, Program Presentations will be on display so that you may contact your colleagues for additional insights.



   from Prism, Spring 1995 (volume 3, issue 2)

A Culture of Evidence

Prudence W. Dalrymple

One of the more important and interesting issues raised by the new accreditation process is the role of evidence in both the program presentation and the evaluation review. Many deans remember the 105 questions that were asked in the Guide to the Self-Study and the “Sources of Evidence” that were listed in the 1972 Standards. In my experience, most deans and faculties are relieved not to be restricted to a standard format and required evidence. On the other hand, the chairs and members of the panels who evaluate the program are charged to base their assessments on observations of reality. Knowing how much data to include in a program presentation involves seeking a balance between making unsupported statements and including vast amounts of evidence “to be on the safe side.” Achieving this balance is not easy, particularly in the early stages of the new accreditation process, and it is a struggle to provide appropriate substantive data to support judgments. I’d like to propose that this problem can be addressed by striving to create a “culture of evidence.”

By a “culture of evidence” I mean a commitment to grounding decisions in data. All institutions collect numeric and factual data that can be tapped to substantiate and strengthen what is said in a program presentation. Often all that is needed is the incentive to analyze and present it in a meaningful way. As the LIS community works together to refine the picture it has of itself as a field, there is ample opportunity to share experience on how to mine the data that is available and to present it in a clear and cogent way, not only for accreditation purposes but also for external constituents.

Planning a program presentation can be likened to designing a research project where the research question is “Does this program meet the Standards?” Additional questions might be: “Does the program meet its own objectives?” and, just as importantly, “Why does it fail to meet these objectives?” As any good researcher knows, the data collected is driven by the question asked. Since the accreditation process is highly influenced by the individual nature of the program and its parent institution, not all programs will ask the same questions or collect the same data. The flexibility of the new process allows schools to individualize their approach to the program presentation. But just as research requires that the investigation be conducted in a systematic way and that the data, analysis, and results be made available for peer review, so the program presentation and the accreditation review require that statements be open to further investigation. As academics, we require nothing less of our students.

The Committee on Accreditation is committed to basing its decisions on the data provided. COA, as it refines its annual reviewing process, is striving to develop a database of numeric and factual data to draw upon in analyzing the reports that accredited programs submit. Collecting the right data, analyzing it appropriately, and reporting it accurately are all areas of concern in the accreditation process. They are all essential elements in creating a “culture of evidence.”
