Broken Links: Undergraduates Look Back on their Experiences with Information Literacy in K–12 Education
Don Latham, PhD, Associate Professor, College of Information, Florida State University, Tallahassee.
Melissa Gross, PhD, Associate Professor, College of Information, Florida State University, Tallahassee.
In the past decade information literacy has received increasing emphasis in K–12 and postsecondary education, yet the information literacy skill levels of high school and college graduates continue to vary considerably. This report compares findings across a subset of data collected in three independent research studies focusing on students’ conceptions and perceptions of how they have learned what they know about information literacy. Competency theory, which posits that low-skilled individuals in some knowledge domains are often unable to recognize their deficiencies and therefore tend to overestimate their abilities, is used as the theoretical framework in each study. Data on participants’ previous experiences with information literacy instruction were collected through surveys or interviews. A majority of students reported that they were largely self-taught, but some also reported having received instruction from school library media specialists (SLMSs) and, to a lesser degree, public and academic librarians. Overall, low-performing students tended to identify peers as sources of knowledge, while proficient students tended to identify SLMSs and teachers as sources of knowledge. These findings have important implications for researchers and practitioners in developing information literacy instruction for low-performing students.
In the past decade, information literacy—defined as the ability to access, evaluate, and use information effectively and ethically—has received increasing emphasis both in the K–12 and higher-education arenas as a cornerstone for both lifelong learning and success in the twenty-first century. Information Power: Building Partnerships for Learning, the most influential set of information literacy standards in the K–12 environment, states: “Students must become skillful consumers and producers of information in a range of sources and formats to thrive personally and economically in the communication age” (AASL/AECT 1998, 2). More recently, the American Association of School Librarians (AASL) has confirmed the complexity and importance of information literacy skills in its Standards for the 21st-Century Learner (AASL 2008). Nor is the issue strictly one of individual success and fulfillment; according to Information Literacy Competency Standards for Higher Education, the most widely used source for standards in the postsecondary education environment, “The uncertain quality and expanding quantity of information pose large challenges for society. The sheer abundance of information will not in itself create a more informed citizenry without a complementary cluster of abilities necessary to use information effectively” (ACRL 2000). As a result, many states, accrediting agencies, schools, and colleges now include information literacy as part of the competencies that students should be able to demonstrate. Both Information Power and the Information Literacy Competency Standards were developed by information professionals, and, while encouraging integrated information literacy instruction across the curriculum, are aimed largely at school library media specialists (SLMSs) in the case of the former and college and university librarians in the case of the latter.
Since the introduction of these standards, a veritable cottage industry has grown up around the development of best practices and assessment tools related to information literacy instruction, 1 but surprisingly little research has involved talking to students at any level about their own perceptions of how they have learned what they know about information literacy and how they prefer to learn new skills. The three studies described below address that gap in the literature by focusing on the conceptions and self-perceptions of college undergraduates about information literacy, the relationship between their self-perceptions and their actual skill levels, and their affective experience and process when searching for information related to self-generated and imposed information-seeking tasks. Each of these studies had some overlap in the kind of data collected. What follows is a presentation and discussion of a subset of data collected in those three independent research studies, specifically the data related to students’ perceptions of how they have learned what they know about information seeking.
SLMSs have responded to the AASL and Association for Educational Communication and Technology (AECT) standards by implementing information literacy instruction in their schools, but they have faced challenges in doing so. For one thing, state-level information literacy standards are often buried within the standards for different subject areas (Harris 2003). For another, the amount of time SLMSs have to devote to instruction varies considerably, depending on staffing and grade level. A recent national survey, for example, found that the median amount of time spent per week on instruction is fifteen hours for elementary school library media specialists, ten hours for middle school library media specialists, and eight hours for high school library media specialists (AASL 2007).
Research indicates that SLMSs do make a difference in students’ information literacy skill levels. In a study of students in a California community college information literacy course, Smalley (2004) found that those who came from high schools with librarians performed much better on both mid-course and final assessments than those students who came from high schools without librarians. Moreover, numerous studies in various states indicate that more time spent on information literacy instruction results in higher scores on academic achievement tests (see, for example, the studies summarized in Lance and Loertscher 2003). Of course, what constitutes information literacy instruction varies from school to school. In a nationwide survey of high school library media specialists, Islam and Murno (2006) found that the ACRL Information Literacy Competency Standard taught most frequently was number five, which involves the ethical use of information (specifically, proper documentation of sources); the skill taught second most frequently was number three, which involves the critical evaluation of sources; and the skill taught least frequently was number four, which involves the effective use of information.
Still, despite the fact that many SLMSs are providing information literacy instruction, significant numbers of students graduate from high school ill prepared for college. A national survey conducted for Achieve discovered that 40 percent of recent high school graduates who went on to attend college felt that they had gaps in their ability to do research, with 10 percent reporting large gaps (Peter D. Hart 2005). Among college instructors, 59 percent felt that their students were poorly prepared to do research (Peter D. Hart 2005). And a study by the Educational Testing Service (2008) found that of three thousand college students and eight hundred high school students who took the ICT Literacy Assessment Core Level Test, only 13 percent scored as information literate.
The overarching theoretical framework used in all three studies is competency theory. Developed by Kruger and Dunning in the field of psychology, competency theory suggests that individuals with low skill levels in some knowledge domains are unlikely to be able to recognize their own deficiencies or to recognize competence in others (Kruger and Dunning 1999). Such individuals consistently rate their skill levels as “better than average” and tend not to revise their self-assessments even in the face of evidence to the contrary. One key question of our research has been whether competency theory pertains to the domain of information literacy—and, if so, what might the implications be for librarians in developing effective information literacy instruction for low-skilled individuals?
These three studies have also been informed by the notion of imposed queries, as described by Gross (1995). Imposed queries are situations in which one person is seeking information on behalf of another, as is the case with school assignments where the teacher imposes the information-seeking activity on the student. Imposed queries may be contrasted with self-generated queries in which the information seeker is motivated by a personal need or interest, as is the case with seeking information related to a hobby, health issue, or purchasing decision. Another question our research attempted to answer was whether students perceive different skill sets as required in the two different kinds of information-seeking situations—and, if so, what are the implications for librarians in developing instruction to promote and reinforce both sets of skills?
Our research also has drawn on the methodological framework provided by Bruce’s 1997 study of higher education administrators’ conceptions of and experiences with information literacy. Using a phenomenographical approach, Bruce conducted interviews with administrators to discover their varying conceptions. The results led her to formulate a relational model of information literacy, i.e., one based on the relations between people (her interview subjects) and information literacy as opposed to a behavioral model that emphasizes a list of attributes. An important question our research attempted to answer was what students’ conceptions of and experiences with information literacy—and, specifically, information literacy instruction—are. The answer to this question has important implications for how SLMSs, teachers, and administrators develop and implement information literacy instruction in the K–12 environment.
The initial research project was a pilot study of undergraduate students enrolled in a college-level information literacy class at Florida State University. The course, “Information and Society,” is a Web-based, asynchronous class that explores various issues related to living in the Digital Age, including information literacy, the information search process, intellectual property, information security, identity theft, and personal information management. Students in two sections of the class taught over two different semesters (fall 2006 and spring 2007) were asked to complete a three-part assignment at the beginning of the course before they had received information literacy instruction. The two classes were taught by a doctoral student in information studies, and, aside from the fact that one of the researchers helped in developing the course, the two researchers were not involved in the two classes in any way. Students were not graded on the assignment per se, but they received one-third of a credit for completing each part. Though the assignment was a required part of the class, participation in the research study was not. Only those assignments for which informed consent was given were analyzed in this project, and all identifying data were removed from the assignments before they were analyzed. The assignment was pretested in a previous semester (spring 2006) and then modified on the basis of the pretest results.
The first part of the assignment asked students to complete a self-generated, information-seeking task in response to a personal question or information problem they were currently facing. Students used a worksheet to keep a record of their expectations about the assignment, the steps they took in seeking information, the search tools they used, the sources they consulted, and their evaluation of the experience. The second part of the assignment asked students to complete an imposed information-seeking task. Specifically, students were asked to find six sources (two Web sources, two sources from an electronic database, and two print sources) on a topic from the course syllabus (such as information security, identity theft, downloading music, etc.) and to compile a bibliography of those sources using American Psychological Association (APA) citation format. Students used a worksheet to keep a record of their expectations about the assignment, the steps they took in seeking information, the search tools they used, the sources they consulted, and their evaluation of the experience. The third part of the assignment asked students to respond to a series of questions designed to elicit their views of their information-seeking skills and experiences. One question on part three asked, “Please describe any instruction you’ve had in how to use the library or search for information online?” Students were given a list of eleven choices and they were asked to select all that apply:
- library instruction in the school media center
- library instruction in the classroom
- library instruction in a college or university library
- computer literacy class
- helped by a parent
- helped by a librarian in a school media center
- helped by a librarian in a public library
- helped by a librarian in a college or university library
- helped by a peer (classmate or friend)
- taught myself
- other (please describe).
From the two classes, 27 (51 percent) out of 53 students chose to participate in the study. Most of these were juniors or seniors, and all were taking the class as an elective. Beyond that, no additional demographic data were collected. Although no objective, standardized instrument was used to measure information literacy skills, results on parts 1 and 2 of the assignment—a self-generated information-seeking task and an imposed information-seeking task—suggest that many of these students had deficits in their skills. In general, students consulted multiple sources only when explicitly instructed to do so, as was the case with the imposed task. With the self-generated task, they relied heavily on Web search engines—and often went no further. Twenty (74 percent) used a Web search engine, and 12 (44 percent) of those used nothing else. Typically, for the self-generated task, the sole criterion students used in evaluating a source was relevance, i.e., whether or not it answered their question or related to their topic. Twenty-three (85 percent) identified that as a criterion, while only 5 (19 percent) mentioned authority or reliability as a criterion they considered. By comparison, for the imposed query, 22 (82 percent) identified relevance as a criterion they used, while 16 (59 percent) mentioned authority or reliability. But for the imposed task other kinds of deficiencies were observed. Ten (40 percent) of the 25 students who submitted the bibliography had trouble finding the requisite numbers of source types—two Web sources, two database sources, and two print sources. Yet, of these 10 students, 7 (70 percent) reported being satisfied or very satisfied with their performance on the task; 2 (20 percent) said that they were not sure, and only 1 (10 percent) reported being dissatisfied. All of the 25 students made significant errors in putting their sources in correct APA format. 
Errors included not putting the list of sources in alphabetical order and not using correct citation format for Web, electronic database, and print sources. Yet 19 (76 percent) students reported being either satisfied or very satisfied with their performance while 3 (12 percent) were unsure and only 2 (8 percent) were dissatisfied.
The most frequent answer to the question, “Please describe any instruction you’ve had in how to use the library or search for information online” was “taught myself” (16, or 59 percent). The second most frequent response was “computer literacy class” (15, or 56 percent). A number of students indicated that they had received instruction in a class of some type. Eight (30 percent) reported having received instruction in the library media center, while 9 (33 percent) reported having received instruction in an academic library and 9 (33 percent) in a regular (nonlibrary) class.
Other than “self,” the nonlibrarians identified as having provided help in the students’ learning of information literacy skills included peers (classmates and friends) (10, or 37 percent) and parents (3, or 11 percent). Librarians were also identified as having provided assistance. Five (19 percent) students reported having learned information literacy skills from a library media specialist, 5 (19 percent) from a public librarian, and 4 (15 percent) from an academic librarian. Although students were given the option of selecting “other” and describing what that might have been, none chose this answer. The findings are summarized in table 1.
In a completely open-ended question, students were asked to share their reason for enrolling in the course. Although a variety of responses were given, including the convenience of taking an online course and needing an elective, most students by far focused on what they thought they might learn from the course. Fifteen students (56 percent) stated that they wanted to expand their knowledge base, and 14 (52 percent) said that they enrolled in the course because they were interested in the subject matter. Eleven (41 percent) felt that the course would be beneficial to their future career and success. Only 4 (15 percent) said that they took the course specifically because they hoped to improve their research skills.
In the second research study, first-semester college freshmen at Florida State University were recruited to take the Information Literacy Test (ILT), a Web-based, 60-item, multiple-choice test developed by researchers at James Madison University (Wise et al. n.d.). Students were recruited through an e-mail solicitation from the top 25 percent and the bottom 25 percent of the incoming class, based on admissions criteria—high school grade point average and standardized admissions test score (either the ACT score or the adjusted SAT score). The intention was to try to recruit high academic achievers and low(er) academic achievers, although the researchers held no preconception that previous academic achievement would necessarily correspond to performance level on the ILT.
The ILT was selected as the information literacy assessment instrument because it focuses on information literacy, rather than on information and computer literacy, and is based on four of the five Information Literacy Competency Standards (ACRL 2000). Standard 4, “The information literate student, individually or as a member of a group, uses information effectively to accomplish a specific purpose,” is not assessed. The ILT provides individual, rather than aggregate, scores, has been validated and tested for reliability, and has been adopted by a number of other universities (Cameron, Wise, and Lottridge 2007). In addition, the administrators of the ILT are able to provide a response time analysis per question per student, which they can then compare to benchmark response times to determine whether each student is spending a reasonable amount of time on each question to be considered giving reasonable effort. The developers of the ILT define three skill levels on the basis of the score: 54 (90 percent) or higher is considered advanced, 39 to 53 (65 percent to 88 percent) is proficient, and 38 (63 percent) or lower is nonproficient (Wise et al. n.d.).
Fifty-one students responded to the e-mail solicitation and agreed to participate in the study. All participants were provided compensation in the form of gift cards to the university bookstore. As an added incentive to try to do well on the ILT, students who scored in the top 15 percent were eligible for a lottery to receive one of four additional gift cards. Participants took a brief Web-based survey before taking the ILT and another brief survey after taking the test. The pre–ILT survey included the question, “How have you learned what you know about finding information (either in a library, on the Internet, or by other means)?” Students were presented with eight predefined answers and were instructed to check all that apply:
- library instruction in the school library media center
- library instruction in the classroom
- helped by a parent
- helped by librarian in a public library
- helped by a librarian in a college or university library (includes library tour or group instruction)
- helped by a classmate or friend
- taught myself
- library instruction provided during FSU orientation
An additional question asked students to “please describe any other ways that you have learned to use the library or to find information.” All project instruments were administered in a proctored environment at the University’s Center for Assessment and Testing.
Of the 51 students who participated in the study, 34 (67 percent) came from the top 25 percent of the class (based on admissions criteria) and 17 (33 percent) from the bottom 25 percent. There were 37 (73 percent) females and 14 (27 percent) males. All were eighteen or nineteen years of age. 2
One (2 percent) student scored as advanced on the ILT, 27 (53 percent) scored as proficient, and 23 (45 percent) scored as nonproficient. Of the 34 students from the top quartile of their admissions class, 9 (26 percent) scored as nonproficient; of the 17 students from the bottom quartile, 13 (76 percent) scored as nonproficient. In both quartiles, those who scored as nonproficient greatly overestimated their skill levels both in predictions they made before taking the ILT and in predictions they made immediately after taking the test. In contrast, students who scored as proficient provided both pre– and posttest estimates that were much closer to their actual scores on the test.
In the pretest survey, students were asked how they had learned what they know about finding information, whether in a library, online, or by other means, and they were allowed to provide multiple answers. Overall, 38 (74 percent) reported that they were self-taught, 21 (41 percent) that they had learned from peers, and 15 (28 percent) that they had learned from parents. Twenty (39 percent) indicated that they had been helped by a librarian in a public library, while 12 (24 percent) stated that they had been helped by a librarian in a college or university library. In terms of formal instruction, 23 (45 percent) students had received instruction in a library media center, 13 (26 percent) in a nonlibrary classroom, and 10 (20 percent) during the university’s orientation sessions. In the “other” category, students mentioned having received instruction through working in a library, getting help from a stranger in a library, and participating in a special program at the university.
When the responses are broken out according to level of proficiency (proficient vs. nonproficient), some differences become apparent. Both groups reported that they were self-taught, with 20 (71 percent) of the proficient students selecting that answer and 18 (78 percent) of the nonproficient students selecting it. Seven (25 percent) proficient students stated that they had been helped by parents, while 8 (35 percent) nonproficient students had received parental help. Seven (25 percent) proficient students had received help from peers, while 14 (61 percent) of nonproficient students said that they had. Twelve (43 percent) proficient students had received help from a librarian in a public library; slightly fewer (8, or 35 percent) nonproficient students had. Relatively few students in each group indicated having received help from a college or university librarian, with 6 (21 percent) of the proficient students selecting that answer and 6 (26 percent) of the nonproficient students selecting it. In terms of formal instruction, greater differences were seen. Fifteen (54 percent) of the proficient students indicated that they had received instruction in a library media center, while only 8 (35 percent) of the nonproficient students said that they had. Similarly, 10 (36 percent) of the proficient students reported having received such instruction in a nonlibrary classroom, while only 3 (13 percent) of the nonproficient students had. Relatively few in each group had received information literacy instruction during the university’s orientation (5, or 18 percent, of proficient students and 5, or 22 percent, of nonproficient students). The findings are summarized in table 2.
The third study also focused on college freshmen at Florida State University, but, unlike study 2, which focused on freshmen at the very beginning of their college career, study 3 focused on freshmen at or near the end of their first year. Twenty students were recruited through an e-mail solicitation to participate in interviews and to take the ILT. The e-mail was sent to the top 10 percent and the bottom 10 percent of the freshman class, based on admissions criteria (high school grade point average and ACT or adjusted SAT score). As with study 2, the intention was to try to recruit both students with proficient information literacy skill levels and students with nonproficient skill levels. Students were given a gift card to the university bookstore as compensation for participating in the interview and were given another gift card after taking the ILT. As an added incentive to try to do well on the ILT, students who scored in the top 15 percent were eligible for a lottery to receive one of two additional gift cards.
The research consisted of semistructured interviews with each student lasting anywhere from 45 to 60 minutes. Each interview was recorded and later transcribed by a graduate research assistant. Both researchers were present during the interviews, with one asking the questions and the other taking notes. The purpose of the interviews was to determine students’ conceptions of and experiences with information seeking and information literacy, including imposed and self-generated information-seeking tasks. During the course of the interviews, students were asked, “How have you learned what you know about finding and using information?” Students were also asked, “If you wanted to learn a new skill related to finding and using information, how would you prefer to learn it?” Each student participated in the interview first and then, typically within one week, took the ILT at the University’s Center for Assessment and Testing.
Each researcher coded the interviews and then compared coding. Analysis used the constant comparative method. Test administrators provided ILT scores and also a response time analysis for each question and each student.
Of the twenty participants in the study, 17 (85 percent) represented the top 10 percent of their class while 3 (15 percent) represented the bottom 10 percent (based on admissions criteria). There were 15 (75 percent) females and 5 (25 percent) males, and almost all were eighteen or nineteen years of age. A variety of majors was represented, including STEM, business/economics, humanities, and music; one student was undecided. Overall, this group of students proved to be competent in their information literacy skill levels. One student (5 percent) scored as advanced, one (5 percent) as nonproficient, and the remainder (18, or 90 percent) as proficient. 3
In the interviews, students indicated a lack of familiarity with the term “information literacy”; nevertheless, the way they described their information-seeking process suggested that they were aware of the key aspects of the concept as defined by information professionals. Many of them, for example, discussed the importance of understanding the topic, identifying and locating relevant sources, evaluating those sources, and using those sources to create a product (such as a research paper), make a decision, or simply satisfy curiosity. Overall, students described information seeking in terms of learning and thinking rather than mastery of computer or library skills. For them, success was defined as finding what they needed to know. Almost all of the students were confident in their information-seeking skills, although they did not feel that their skills were in any way unusual. Several students explained that they were like most people of their generation: They had grown up using computers and were therefore comfortable finding information.
When asked how they had learned what they know about finding information, students reported both formal and informal instruction. Seven (35 percent) indicated that they had had a library class, and one (5 percent) mentioned attending a library term paper clinic. Six (30 percent) students referred specifically to learning about information seeking in a library media center, and 5 (25 percent) of those 6 recalled the library media center experience having occurred during their time in elementary school. Three (15 percent) mentioned having received instruction in a college library. None specifically mentioned having received instruction in a public library. Students also mentioned having received instruction from teachers in how to find information. Thirteen (65 percent) students overall identified teachers as sources of information literacy instruction. Some students identified teachers at specific levels (with some students identifying more than one level). Five (25 percent) stated that they received information literacy instruction from teachers in elementary school, 6 (30 percent) mentioned high school teachers, and 3 (15 percent) mentioned college teachers. Interestingly, none mentioned middle school teachers.
Students also described informal sources of information literacy instruction. Eight (40 percent) identified parents as sources of such training, with six (30 percent) of these identifying their mothers as sources. Three (15 percent) identified other family members, of whom 2 (10 percent) identified brothers. Four (20 percent) stated that they had learned about finding information from friends, while only 2 (10 percent) said that they had been instructed informally by librarians. Not surprisingly, 17 (85 percent) described themselves as self-taught, and 4 (20 percent) mentioned specifically that, because they grew up with the Internet, learning about finding information came naturally to them.
When asked how they would prefer to learn new information-seeking skills, one (5 percent) student stated that he or she would like to take a class in the library; 4 (20 percent) said that they would like to learn in a regular (nonlibrary) classroom. Only 2 (10 percent) students preferred to learn from an online tutorial, while 13 (65 percent) said that they would like to learn in a one-on-one, face-to-face situation with someone who already has the skills. Such people included librarians, teachers, relatives, and friends. The one-on-one situation offers several advantages that students mentioned: It facilitates the asking of questions (4, or 20 percent), allows for hands-on practice (3, or 15 percent), and provides a comfortable atmosphere in which to learn (2, or 10 percent). Three (15 percent) students stated explicitly that learning from a friend was easier than learning from someone who might be considered more of an authority figure. The findings are summarized in table 3.
The students in the three studies demonstrated varying levels of proficiency in information literacy, yet almost uniformly reported confidence in their abilities to find and use information. The juniors and seniors in study 1, for example, expressed satisfaction with their performance on an imposed query, yet many of them had difficulty locating the specified source types and in creating a bibliography in correct APA citation format. Similarly, nearly half (45 percent) of the freshmen in study 2 scored in the nonproficient range on the ILT, although they predicted (both before and after taking the test) that their scores would be much higher. The freshmen in study 3 also expressed confidence in their information literacy skills, and in their case their self-assessments proved to be accurate on the basis of their scores on the ILT. In general, these studies suggest that, in terms of information literacy, confidence is not a reliable predictor of competence. These results support the basic tenets of competency theory, in that those who demonstrated nonproficiency tended to greatly overestimate their performance on either an imposed query or the ILT. By comparison, those who scored as proficient on the ILT expressed confidence in their abilities but did not believe that they had an unusually high skill level. These results also raise the question of how library media specialists (and other librarians) can most effectively address the instructional needs of low performers who most likely do not recognize that they need instruction.
In all three studies, students reported having received instruction in a library media center or from an SLMS. Approximately one-third (30 percent) of the students in studies 1 and 3 identified an SLMS as a source of their information literacy knowledge, while nearly half (45 percent) of the students in study 2 did. Moreover, 54 percent of the students who scored as proficient on the ILT in study 2 reported having received instruction from an SLMS (as compared to 35 percent of the students who scored as nonproficient). One might expect that more students would have identified SLMSs as a source of their information literacy knowledge, given the great emphasis placed on information literacy competence over the past decade. It may be that some students simply do not remember instruction that they did, in fact, receive. If that is the case, however, it suggests that the instruction made little impression on them. But, as other studies have shown (for example, Harris 2003), it is also the case that the emphasis placed on information literacy instruction varies considerably from district to district and even from school to school because of a number of factors, including budgets, staffing, and administrative support.
Public and academic librarians were also identified as sources of information literacy knowledge, although in both cases to a lesser extent than SLMSs. Students in study 2 had been on campus less than one month and therefore had had little opportunity to interact with academic librarians; students in study 3 were mostly second-semester freshmen, but they reported little interaction with academic librarians (only 15 percent). By comparison, the juniors and seniors in study 1 reported more interaction with academic librarians (33 percent). The results suggest that if the students in these studies did not receive information literacy instruction from SLMSs at the K–12 level, they were no more likely to receive such instruction from another type of librarian.
Of the three study groups, the students in study 2 most frequently identified librarians as sources of information literacy instruction. Forty-five percent stated that they had received instruction from an SLMS, 39 percent from a public librarian, and 24 percent from an academic librarian. (It should be noted that freshmen who attend the university’s orientation are provided with some library instruction at that time.) In comparison, about one-third of the students in study 1 had received instruction from an SLMS and about one-third from an academic librarian, but only one-fifth from a public librarian. Within the two student populations in study 2 (proficient and nonproficient), the only appreciable difference between the two groups in terms of the type of librarian providing instruction involved SLMSs: among the proficient students, over half had received instruction from an SLMS, while among the nonproficient students only about one-third had. Among the participants in study 3, only 15 percent had received instruction from an academic librarian, and none specifically identified a public librarian as a source of instruction.
A number of students in all three studies identified teachers and professors as sources of information literacy knowledge. Responses ranged from approximately one-fourth in study 2 to one-third in study 1 and two-thirds in study 3. One might surmise that teachers who include a research paper (or presentation) as part of a course also provide some instruction in the research process. None of the studies attempted to identify whether librarians and teachers (at any level) worked collaboratively in developing and delivering information literacy instruction, and, in any case, such collaboration might not have been readily apparent to students.
The students across all three studies identified various informal means by which they had learned information literacy skills. By far the method most frequently mentioned was being “self-taught,” reported by 59 percent in study 1, 74 percent in study 2, and 85 percent in study 3. Based on the research literature (for example, Harris 2003), it is clear that information literacy instruction is not uniformly provided in K–12 education. However, while a majority of students reported that they were self-taught, the data we collected do not indicate that students received no other kind of instruction. In fact, as discussed above, a number of them reported receiving instruction from librarians and teachers. Among the more informal sources of instruction were peers and parents. Interestingly, students in study 3, all but one of whom scored as proficient on the ILT, identified self (85 percent) and parents (40 percent) as sources of information literacy knowledge more frequently than did the students in the other two studies. By the same token, the students in study 3 identified peers (20 percent) as sources of knowledge only half as frequently as did the students in studies 1 and 2. Perhaps it is not surprising, then, that the students in study 3 stated that they would prefer to learn new skills in a one-on-one situation where they felt comfortable asking questions and where they were given the opportunity for hands-on practice.
The relatively small number of students in each of the studies described here means that the findings are not generalizable to the population of college students overall. In addition, the participants were all students at a Research I state university with a competitive admissions process. All of the students therefore demonstrated academic achievement in high school and on standardized admissions tests. These students fall within a fairly narrow range of academically successful students and cannot be considered representative of other kinds of college students. Students in community colleges, for example, with open-access admissions policies, represent a much wider range of academic backgrounds and abilities. Finally, each of the studies described here depended on the self-reporting of remembered instruction. Thus participants’ responses are subject to the limitations of possibly inaccurate memories and selective self-reporting.
The findings from these three studies, though not generalizable, nevertheless suggest implications for both research and practice. For one thing, competency theory provides a potentially useful way for SLMSs and other educators to analyze and address the needs of low-skilled students. The problem with designing effective instruction may be partly one of motivation, as others have suggested (see, for example, Small 1998, 1999). Motivation often involves convincing students that something is of value and providing a tangible reward for mastery of skills or content. But it may also be that the lack of motivation on the part of low-skilled students stems not from their failure to recognize the value of strong information literacy skills, but rather from their failure (or their inability) to recognize that their own skill levels need improvement. To design interventions for these students, this lack of awareness must be taken into consideration and addressed; then, perhaps, effective motivational strategies can be integrated into interventions. More research is needed to understand the differences between proficient and nonproficient students (see Gross and Latham 2007 for a full discussion of the differences between the proficient and nonproficient students in study 2). Research building on the preliminary results of these three studies is also needed to understand more fully the implications of this work for designing effective information literacy instruction.
The findings from these three studies also suggest that students do in fact recall receiving information literacy instruction from SLMSs and other librarians, although perhaps not in the numbers we would like to see. It may be that, as several researchers have suggested (see, for example, Burhanna and Jensen 2006, Carr and Rockman 2003, Ercegovac 2003, and Nutefall 2001), more collaboration among SLMSs, public librarians, and academic librarians can help to provide a consistent program of instruction to students in various library environments. By the same token, more collaboration between SLMSs and teachers can lead to positive and consistent reinforcement of what is being learned both in the library media center and in the classroom. Moreover, a new conception of information literacy may be needed, one based on students’ experiences with information and information seeking rather than what Macpherson (2004) calls “an information-processing model” based on following discrete steps in a procedure (cited in Budd 2008). Such research might take a phenomenographic approach, one focused on conceptions and perceptions rather than formulaic competencies, following, for example, the work of Bruce (1997) with college administrators and that of Maybee (2006, 2007) and Budd (2008) with college undergraduates. Incorporating student conceptions and perceptions into information literacy instruction might allow SLMSs, other librarians, and other educators to more effectively address the information literacy needs of all students. Clearly, future research is needed among K–12 students to inform best practices for information literacy instruction and to ensure that students at each level are prepared to advance to the next level, to be successful academically, and to be lifelong learners.
The authors would like to thank the Florida State University Council on Research and Creativity for providing a Planning Grant for the completion of study 2, and the Online Computer Library Center (OCLC) and the Association for Library and Information Science Education (ALISE) for providing an OCLC/ALISE Library and Information Science Research Grant for the completion of study 3. The authors also gratefully acknowledge the assistance of the following doctoral students in the College of Information at Florida State University: Debi Carruth, Annette Goldsmith, James Hernandez, and Joung Hwa Koo.
1. Some examples of such assessment tools include Educational Testing Service’s iSkills, Project SAILS’s Standardized Assessment of Information Literacy Skills, and James Madison University’s Information Literacy Test (discussed in the text).
3. Complete findings from Study Three are discussed in Gross and Latham 2008.
American Association of School Librarians (AASL). 2007. School libraries count! A national survey of school library media programs 2007. www.ala.org/ala/aasl/slcsurvey.cfm (accessed Aug. 12, 2008).
———. 2008. Standards for the 21st-Century Learner. www.ala.org/ala/aasl/aaslproftools/learningstandards/standards.cfm (accessed Sept. 2, 2008).
American Association of School Librarians (AASL) and Association for Educational Communication and Technology (AECT). 1998. Information Power: Building partnerships for learning. Chicago: ALA.
Association of College and Research Libraries (ACRL). 2000. Information literacy competency standards for higher education. www.ala.org/ala/mgrps/divs/acrl/standards/informationliteracycompetency.cfm (accessed Nov. 12, 2008).
Bruce, C. 1997. The seven faces of information literacy. Adelaide: Auslib Pr.
Budd, J. M. 2008. Cognitive growth, instruction, and student success. College & Research Libraries 69: 319–30.
Burhanna, K. J., and Jensen, M. L. 2006. Collaborations for success: High school to college transitions. Reference Services Review 34: 509–19.
Cameron, L., Wise, S. L., and Lottridge, S. M. 2007. The development and validation of the Information Literacy Test. College & Research Libraries 68: 229–36.
Carr, J. A., and Rockman, I. F. 2003. Information-literacy collaboration: A shared responsibility. American Libraries 34(8): 52–54.
Educational Testing Service. 2008. iSkills. www.ets.org (accessed Sept. 2, 2008).
Ercegovac, Z. 2003. Bridging the knowledge gap between secondary and higher education. College & Research Libraries 64: 75–85.
Foster, A. L. 2006. Students fall short on “information literacy,” Educational Testing Service’s study finds. Chronicle of Higher Education 53(10): A36.
Gross, M. 1995. The imposed query. RQ 35: 236–44.
Gross, M., and Latham, D. 2007. Attaining information literacy: An investigation of the relationship between skill level, self-estimates of skill, and library anxiety. Library & Information Science Research 29: 332–53.
———. 2008. Self-views of information seeking skills: Undergraduates' understanding of what it means to be information literate. Report to the Online Computer Library Center and the Association for Library and Information Science Education.
Harris, F. J. 2003. Information literacy in school libraries: It takes a community. Reference & User Services Quarterly 42: 215–23.
Islam, R. L., and Murno, L. A. 2006. From perceptions to connections: Informing information literacy program planning in academic libraries through examination of high school library media center curricula. College & Research Libraries 67: 492–514.
Kruger, J., and Dunning, D. 1999. Unskilled and unaware of it: How difficulties in recognizing one’s own incompetence can lead to inflated self-assessments. Journal of Personality and Social Psychology 77: 1121–34.
Lance, K. C., and Loertscher, D. V. 2003. Powering achievement: School library media programs make a difference: The evidence mounts. 2nd ed. Salt Lake City, Utah: Hi Willow.
Macpherson, K. 2004. An information processing model of undergraduate electronic database information retrieval. Journal of the American Society for Information Science and Technology 55: 333–47.
Maybee, C. 2006. Undergraduate perceptions of information use: The basis for creating user-centered student information literacy instruction. Journal of Academic Librarianship 32: 79–85.
———. 2007. Understanding our student learners: A phenomenographic study revealing the ways that undergraduate women at Mills College understand using information. Reference Services Review 35: 452–62.
Nutefall, J. E. 2001. Information literacy: Developing partnerships across library types. Research Strategies 18: 311–18.
Peter D. Hart Research Associates and Public Opinion Strategies. 2005. Rising to the challenge: Are high school graduates prepared for college and work? A study of recent high school graduates, college instructors, and employers. www.achieve.org/node/548 (accessed Aug. 12, 2008).
Project SAILS. 2007. Standardized assessment of information literacy skills. www.projectsails.org (accessed Sept. 2, 2008).
Small, R. V. 1998. Designing motivation into library and information skills instruction. School Library Media Quarterly Online 1. www.ala.org/ala/aasl/aaslpubsandjournals/slmrb/slmrcontents/volume11998slmqo/small.cfm (accessed Aug. 27, 2008).
———. 1999. An exploration of motivational strategies used by library media specialists during library and information skills instruction. School Library Media Research 2. www.ala.org/ala/aasl/aaslpubsandjournals/slmrb/slmrcontents/volume21999/vol2small.cfm (accessed Aug. 27, 2008).
Smalley, T. N. 2004. College success: High school librarians make the difference. Journal of Academic Librarianship 30: 193–98.
Wise, S. L., et al. n.d. Information Literacy Test: Test development and administration manual. Harrisonburg, Va.: James Madison University.