Examining Information Problem-Solving, Knowledge, and Application Gains within Two Instructional Methods: Problem-Based and Computer-Mediated Participatory Simulation
Terrance S. Newell, PhD, Assistant Professor, School of Information Studies, University of Wisconsin–Milwaukee.
This study compared the effectiveness of two instructional methods—problem-based instruction within a face-to-face context and computer-mediated participatory simulation—in increasing students’ content knowledge and application gains in the area of information problem-solving. The instructional methods were implemented over a four-week period. A two-group, pretest–posttest, randomized control design coupled with an independent sample t-test on learning group gains was used to examine effectiveness. The results of this study show that the simulation group experienced significant overall gains (combined knowledge and application abilities) from preimplementation to postimplementation (t = 1.852, p = .04); however, when overall development is divided into its constitutive parts, the results suggest that the two instructional methods may have distinctly different affordances.
The following statement, “immersive spaces are good for learning,” is an emerging academic position and area of research (Shaffer et al. 2005; Gee 2005; Shaffer 2008; Aldrich 2005) that has the potential to augment teaching and learning practices within many K–12 subject areas, including information problem-solving instruction. As an academic position, the view of immersive environments (e.g., video games, computer games, simulations, and desktop virtual reality environments) within the instructional landscape is theoretically persuasive. Within this position, immersive technologies are viewed as potentially powerful learning spaces that could be integrated into the teaching and learning context. In other words, virtual worlds could be integrated into the education landscape if they are developed using immersive technology and focused on learners solving complex problems within a community of practice (Shaffer et al. 2005). This academic position accords well with the relatively recent move toward authentic disciplinary practices—a sociocultural view of teaching and learning reflected in many journal publications and standards documents such as the National Council of Teachers of Mathematics (2000) and the National Research Council (1996)—that refer to an approach to student learning that requires them to engage the practices of specialists within disciplines (e.g., historians, scientists, mathematicians, and information specialists) instead of simply learning the facts associated with those disciplines (Ford and Forman 2006). In concert with the disciplinary-practices view of teaching and learning, it is argued that complex and immersive teaching environments could augment learning by presenting new social and cultural worlds that allow students to participate in new social practices and take on new identities (Shaffer et al. 2005).
As an area of research, empirical evidence is needed to illuminate the effectiveness of these environments within the K–12 IL instructional landscape. This study compared the effectiveness of a face-to-face, problem-based instruction method to a computer-mediated participatory simulation (see figure 1). The problem-based method uses instructional opportunities that require students to learn information problem-solving by engaging real-world, complex problems with uncertain and multiple information solutions (Blumenfeld et al. 1991). The participatory simulation method requires students to develop subject-specific literacies through simulated experiences, interactions, and communities of practice. Moreover, within the simulation method, students learn problem-solving practices by actively participating as information literacy (IL) apprentices within a computer-generated community of information professionals.
The development of computer information systems, ideas of an information-based society, and an alarming report (National Commission on Excellence in Education 1983) on the status of American education created a dynamic stage-setting for the development of a K–12 IL discourse. Within that setting, library and information science (LIS) scholars and information organizations worked to develop the discourse within the K–12 educational landscape. While the list of discursive contributions is very long, there are some seminal moments during the discursive development of IL. For example, research conducted by Irving (1985) found that educators valued skills needed to learn from information resources more than skills associated with the retrieval of resources. Scholars (e.g., Mancall, Aaron, and Walker 1986; Craver 1989) created a nexus between the concept of IL and the skills tied to critical thinking. Carol Kuhlthau (1987) illuminated the historical developments of the information literacy discourse and encouraged school library media specialists (SLMSs) to make integrated IL instruction a primary aspect of their programs. Breivik, Hancock, and Senn (1998) promoted the shift from a technology-focused IL discourse to a skill-focused discourse. The ALA Presidential Committee on Information Literacy articulated a definition of IL and recommendations for furthering the concept within areas such as primary and secondary education (ALA 1989). The National Forum on Information Literacy (NFIL)—a coalition of education, business, and government organizations—formed to advance the concept within various domains of study and practice, including K–12 education. A Delphi study conducted by NFIL members tied the concept of IL to national education goals, and Doyle (1994) further articulated the possibilities of IL within national goals. 
In a position statement, the Wisconsin Education Media Association (WEMA), endorsed by the American Association of School Librarians (AASL), positioned information problem-solving content and practices (i.e., identifying tasks, employing search strategies, locating resources, accessing information physically and intellectually, interpreting information, communicating information, and evaluating the problem-solving product and process) as the central focus of IL teaching and learning on the K–12 level (WEMA and AASL 1993), and the AASL collaborated with the Association for Educational Communications and Technology (AECT) to publish IL standards (AASL and AECT 1998).
Information problem-solving instruction refers to the teaching and learning of practices related to the creation of information solutions to problems. The framework that underpins the concept is a process-oriented one, and a dominant theme that runs through the concept is the coupling of practices needed to access and use information with those needed to apply and solve information problems (Wolf, Bush, and Saye 2003). The constitutive practices of information problem-solving are identifying tasks, employing search strategies, locating resources, accessing information physically and intellectually, interpreting information, communicating information, and evaluating the problem-solving product and process. Task identification (also known as problem definition) refers to a practice of recognizing the existence of an information-based problem and defining the needs associated with that problem (WEMA and AASL 1993). Search strategy initiation refers to the development of a plan that will be employed to find information. Information location refers to an ability to find resources within information landscapes and information within particular resources. Information evaluation refers to the determination of information accuracy, comprehensiveness, relevance, and usefulness. Information use refers to the integration and synthesis of information to solve a defined problem. Information communication refers to the effective presentation of problem-solving resolutions. Problem-solving product and process evaluation refers to a critical assessment of the final resolution of a problem and the processes employed in generating it.
Integrated IL Instruction
The AASL promotes the full integration of IL skills within the teaching and learning landscape (AASL and AECT 1998), and presently, SLMSs are primarily employing integrated approaches to information problem-solving instruction, which are collaborative efforts between SLMSs and subject area educators to contextualize information problem-solving content and practices within ongoing classroom instruction (Thomas 2004). Furthermore, since information problem-solving has a process orientation, collaborating educators have traditionally used instructional methods (e.g., resource-based learning, project-based learning, and inquiry learning) within integrated approaches (Eisenberg, Lowe, and Spitzer 2004; Thomas 2004). AASL strongly promotes the idea of “library media specialists work[ing] with teachers to plan, conduct and evaluate learning activities that incorporate information literacy” (AASL and AECT 1998, 50), and much of the K–12 research literature (e.g., Irving 1985; Pitts 1995; Thomas 2004; Todd 1995) seems to support the idea of integration. For instance, Todd (1995) examined the use of integrated IL instruction with middle school science students and found a positive effect on learning. Bingham (1994) compared an integrated information skills approach with a traditional [noncontextualized] approach and reported significantly higher scores for students learning within the integrated approach. Pitts (1995) found integrated instruction necessary because complex information assignments and activities require students to use multiple domains of knowledge including subject matter and IL knowledge. Hara (1996) found integrated IL instruction more effective than both no instruction and noncontextualized instruction.
It is currently best practice to use process frameworks in conjunction with integration and instructional methods (Thomas 2004). Process frameworks are user-centered, cognitive frameworks that focus on strategies for thinking during research and problem-solving activities (Thomas 2004). Examples of process frameworks include the Inquiry Model (Sheingold 1986), Information Search Process (Kuhlthau 1988), Big6 (Eisenberg and Berkowitz 1990), React (Stripling and Pitts 1988), Pathways to Knowledge (Pappas and Tepe 1997), and I-Search (Joyce and Tallman 1997). Scholars like Eisenberg and Brown (1992) propose that there are more similarities than differences among the frameworks because they all guide students through an iterative process of solving an information-based problem or research project.
The general benefits of integrated IL instruction have been articulated through both anecdotal and empirical evidence. However, postsecondary research is indicating a lack of undergraduate IL competencies and abilities. Moreover, studies are indicating a lack in students’ information confidence (National Center for Postsecondary Improvement [NCPI] 2001), ability to formulate and focus research questions (Fitzgerald 1999; Quarton 2003), ability to access information (Intersegmental Committee of the Academic Senates [ICAS] 2002; NCPI 2001; Quarton 2003), ability to evaluate information (ICAS 2002; Dunn 2002; Fitzgerald 1999), ability to organize information, and ability to create and communicate information (Dunn 2002). Emerging postsecondary research indicates that much remains to be done in the area of IL instruction. In other words, there is a continuous need for the interrogation and augmentation of current best practices, particularly integrated instructional methods. More empirical evidence is needed to illuminate the affordances and effectiveness of dominant integrated instructional methods (such as problem-based learning), and empirical interrogations of dominant methods as compared to emerging methods (e.g., computer-mediated methods) are also needed.
Immersive Instructional Methods: An Emerging School of Thought and Research
Immersive instructional methods have the potential to expand the range of instructional practices within the area of K–12 IL education (Shaffer et al. 2005; Gee 2005; Shaffer 2007; Aldrich 2005). Immersive instruction refers to the coupling of learning theory, domain-specific knowledge, practices, and technology (e.g., video games, computer games, simulations, and desktop virtual reality environments) to create powerful learning environments (virtual worlds). These virtual learning environments allow students to learn new social practices (e.g., information-literate practices) and develop new identities (e.g., information-literate identities) through the process of solving complex problems and participating in communities of practice (Shaffer et al. 2005).
Research studies (e.g., Randel et al. 1992; Dempsey et al. 1994; Emes 1997; Harris 2001; Lee 1999; Rosas et al. 2003; Laffey et al. 2003; Rieber 1996; Gorriz and Mediana 2000; Prensky 2001) give insight into the affordances of immersive instruction (e.g., electronic gaming and use of simulations) within the education landscape. Randel et al. (1992) examined empirical research studies in the area of gaming and education that were published between 1963 and 1991. This review of research strictly focused on studies comparing the instructional effectiveness of games to traditional classroom instruction, and the researchers found that thirty-eight of the sixty-seven studies concluded that teaching via gaming was just as effective as teaching using traditional practices. The researchers also found that twenty-seven of the sixty-seven studies concluded instruction-as-gaming to be more effective than instruction within traditional practices. Other empirical research studies show that gaming approaches are just as effective as traditional teaching approaches in teaching basic math and reading comprehension skills (Rosas et al. 2003; Laffey et al. 2003), mathematical problem-solving skills (Van Eck and Dempsey 2002), basic logic (Costabile et al. 2003), geographical content knowledge (Wiebe and Martin 1994; Virvou, Katsionis, and Manos 2005), and vocabulary skills (Malouf 1988). The gaming and learning literature has also illuminated particular affordances associated with these learning environments such as the development of critical thinking skills (Rieber 1996), problem-solving skills (Rieber 1996; Gorriz and Mediana 2000; Prensky 2001), visual and spatial skills (Greenfield et al. 1994), cognitive strategies (Gredler 1996), discovery learning (Prensky 2001; Provenzo 1992), student engagement and interactivity (Malone and Lepper 1987; Rosas et al. 2003; Price 1990), visual skills attainment, motor skill growth, and computer-using skills.
Research also indicates that desktop virtual reality and computer simulation environments can improve the development of subject area knowledge and process skills. For example, Joseph Akpan and Thomas Andre (2000) conducted a study to examine the use of a frog dissection simulation—as compared to traditional approaches—in improving students’ knowledge of morphology and anatomy. The researchers found that the simulated dissection group gained significantly more anatomy knowledge, and the researchers concluded that computer simulations are effective learning environments that afford students the opportunity to “search for meaning, appreciate uncertainty and acquire responsibility for their own learning” (Akpan and Andre 2000, 311). Geban, Askar, and Ozkan (1992) studied the effectiveness of three instructional approaches (computer simulation, problem-solving, and traditional) in augmenting students’ chemistry knowledge, science process skills, and attitudes toward chemistry. Researchers found that the problem-solving and computer simulation approaches were significantly more effective than the traditional approach. Woodward, Carnine, and Gersten (1988) focused on the effectiveness of a computer simulation as compared to a structured teaching approach in teaching health knowledge to mildly handicapped students. The researchers found that the computer simulation group significantly outperformed the structured teaching group in respect to basic facts, concepts, and problem-solving skills, with the greatest difference in the problem-solving area. David Ainge (1996) conducted a pretest–posttest, two-group, virtual reality study using the VREAM Virtual Reality Development System as compared to a conventional instructional method in developing students’ abilities (i.e., visualization, naming, and recognizing). 
The researcher found that the VR learning environment had a slight positive effect on shape visualization and name writing; however, the VR environment strongly enhanced the students’ ability to recognize shapes in everyday contexts. Rivers and Vockell (1987) conducted a study to determine if a computer simulation could be used to enhance high school students’ problem-solving abilities in the area of biology. The study had three groups: (1) unguided discovery, which provided students with only a brief introduction to the simulation before use; (2) guided discovery, which provided students with an in-depth introduction and strategies to use in solving simulated problems; and (3) a control group, which was taught the same topics using lectures, textbooks, and laboratory exercises. Results indicated that students using the unguided discovery simulation performed as well as the control group; however, the students using the guided discovery simulation significantly outperformed all other students in respect to scientific thinking and critical thinking abilities. Paul Kelly (1998) conducted a study to determine if students learning within a simulation environment could transfer knowledge and perform as well as students who received instruction in a laboratory setting. The researcher found that the computer simulation students performed as well as the laboratory students.
This emerging school of thought—that immersive spaces are good for learning—and the technologies tied to it should be empirically explored within the area of K–12 IL instruction.
Purpose of the Study
This study compared the effectiveness of a face-to-face, problem-based instruction method to a computer-mediated participatory simulation. The problem-based method used instructional opportunities that required students to learn information problem-solving by engaging real-world, complex problems with uncertain and multiple information solutions (Blumenfeld et al. 1991). The participatory simulation method required students to develop subject-specific literacies (e.g., information problem-solving) through computer-simulated experiences, interactions, and communities of practice.
Three research questions were generated for this study:
- Are the overall gains of the two groups (problem-based and simulation) significantly different after a four-week implementation period?
- Are the content knowledge gains of the two groups (problem-based and simulation) significantly different after a four-week implementation period?
- Are the application gains of the two groups (problem-based and simulation) significantly different after a four-week implementation period?
The first hypothesis, which is based upon previous educational research using computer simulations and virtual reality platforms, stated that the participatory simulation group (PS) would display significantly larger overall gains from pretest to posttest (H1: problem-based group [PB] < [PS]). The second hypothesis stated that the PS would display significantly larger content knowledge gains from pretest to posttest (H2: PB knowledge gain < PS knowledge gain). The third hypothesis stated that the PS would display significantly larger application gains from pretest to posttest (H3: PB application gain < PS application gain).
Instructional Methods Used in the Study
Computer-Mediated Participatory Simulation
A three-dimensional, computer-generated, participatory simulation was employed as an instructional method for problem-solving teaching and learning (see figure 1). The simulation, which is constructed upon sociocultural learning principles, was the result of a rapid design ethnography that attempted to extend instructional role possibilities for SLMSs (Newell 2004).
The 3D interface of the simulation was developed using authorware similar to the design tools used within the online virtual world called Second Life. The simulation interface was highly interactive and allowed students to navigate the 3D environment from every perspective using the computer screen, mouse, and keyboard. The 3D simulated contexts consisted of a middle school library (see figure 2), high school library (see figure 3), informal information environments (see figure 4), and electronic environments (see figure 5). The various contexts were designed to represent a small community or town; in other words, students could virtually walk from the middle school library to the high school library within the 3D simulation. Furthermore, the 3D simulation technology enabled the construction of virtual information objects, artifacts, and resources (e.g., books, computers, televisions, and people). Both the virtual objects and contexts were interactive and responded to the participants’ actions; for example, the virtual (simulated) computers worked and students could virtually use many of the books within the space. Within the 3D simulation environment, students could move and interact freely and collaboratively using avatars, communicate using chat features and gestures, and use a variety of information objects, artifacts, and resources.
Within the participatory simulation, students learned information problem-solving practices through simulated experiences, interactions, and communities of practice. Moreover, students learned problem-solving practices by actively participating as IL apprentices within a computer-generated community of information professionals (called Cybrarians). As IL apprentices, students consulted the simulated information professionals and assisted them in meeting the information needs of other computer-generated characters within the simulation using novice problem-solving knowledge and practices (Wenger 1998). All information-oriented problems emerged from computer-generated characters needing help, and through participation students learned information practices—gradually developing from novices to masters. Moreover, student learning was guided using three techniques: scaffolds, communities of practice, and cognitive process frameworks. First, instructional scaffolding helped students as they learned through active participation, and these learning supports (scaffolds) existed in the form of tutorials, information on-demand, just-in-time pop-ups (Gee 2007), coaching, modeling (Collins, Brown, and Newman 1989), cognitive structuring (Gallimore and Tharp 1990), exploration (Collins, Brown, and Newman 1989), and questioning (Gallimore and Tharp 1990). Second, a community of practice (CoP) also helped the development of practices. Information professionals—a group that is bounded by best practice approaches to information problem-solving—constituted the CoP, and within the 3D simulation, students could work with members of this CoP (e.g., Cybrarians, real-world librarians, and other student apprentices) to solve problems. Third, as students engaged tasks, they used a cognitive process framework to cognitively structure the stages of information problem-solving and thinking strategies related to the different stages.
The stages were task identification, search strategy initiation, information location and access, information evaluation, information use, information communication, and problem-solving product and process evaluation.
The initial development of all simulation tasks was done by the researcher. A variety of information literacy and information problem-solving textbooks and articles were analyzed and used in the construction of tasks. However, the SLMS and the technology teacher at the research site reviewed and augmented all tasks a month before the study was conducted. These educators also added tasks during this period.
Problem-Based Instruction within a Face-to-Face Context
As stated above, it is currently best practice to teach information problem-solving using an integrated instructional approach coupled with an instructional method and process framework (Thomas 2004; AASL and AECT 1998). Staying true to the integrated approach, the technology teacher and information specialist decided to design a technology-focused unit of study that embedded information problem-solving lessons into the class activities. The title of the unit was Applied Computer Skills, and its general purpose was to provide an overview of how technology can be used in everyday life. The educators, surprisingly, began the construction of this unit by examining the participatory simulation tasks to determine the primary technological applications and problem-solving processes used. For example, many simulation tasks required students to search for websites, watch and listen to online videos, and search catalogs; therefore, the educators designed the unit with daily lessons focusing on those technologies and processes. Second, the educators took the 3D simulation tasks and restructured them into problem-based learning situations to be used as integrated classroom activities that supported the daily lessons. The educators decided to use the simulation tasks to ensure that students within both learning environments experienced the same content and similar activities.
The educators employed a problem-based instructional method and a generic process framework. Within the problem-based method, the educators designed instructional opportunities that required students to learn information problem-solving by engaging real-world, complex problems with uncertain and multiple information solutions (Blumenfeld et al. 1991). These instructional opportunities emphasized student-driven question formulation, multimodal information searching, use of information artifacts (e.g., books and computers), sense-making, and multimodal information use (Sheingold 1986; Blumenfeld et al. 1991; Callison 1986). Furthermore, all instructional opportunities culminated in concrete demonstrations of problem-solving practices (Blumenfeld et al. 1991; Callison 1986). During instruction, the technology teacher and information specialist structured, supported, and guided student-constructed understandings and products (Blumenfeld et al. 1991; Callison 1986), and as students engaged tasks, they used a generic process-oriented framework that guided them through the stages of information problem-solving and thinking strategies related to the different stages. The method also infused aspects of inquiry learning. For example, students were viewed as self-reflective learners in a metacognitive sense, which refers to “knowing about [himself or herself] and other people as knowers, knowing about the task to be undertaken, knowing what strategies to apply to the task, and how to monitor one’s own performance with respect to the task” (Sheingold 1986, 84). Students were also viewed as active learners motivated by the real-world nature of problems and multimodal information contexts, learners who actively used prior knowledge in the personal construction of information solutions to problems (Sheingold 1986).
Within the instructional method, students were also required to work in teams of two during the completion of tasks (Callison 1986; Bransford and Stein 1993). The face-to-face teaching and learning context consisted of a computer lab, middle school library, and all of the information objects and tools within them (e.g., library catalogs, books, computers, websites, and search engines). The middle school library media center and computer lab were connected, and the computer lab housed enough computers for each student to have his or her own during the learning periods.
Research Design and Methods
The design consisted of a two-group, pretest–posttest, randomized, controlled trial employed to examine the effectiveness of the two instructional methods. Two seventh grade and two eighth grade computer literacy classes (54 students total) were assigned to instructional methods (problem-based or participatory simulation) through a randomization process. The researcher attempted to recruit the entire seventh and eighth grade population of computer lab students for the study (77 students); however, 23 students were excluded from the study because of a lack of student or parent consent or excessive absences during the four-week implementation period. The participatory simulation group (PS) was composed of 27 students (12 seventh grade and 15 eighth grade students), and the problem-based group (PB) was also composed of 27 students (11 seventh grade and 16 eighth grade students). Both groups were given a pretest, experienced four weeks of instruction, and were then tested again (posttest).
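The assignment step described above can be sketched generically. The helper below is hypothetical (the study randomized intact classes to conditions, so the roster here stands in for whichever units are being assigned); it performs a seeded shuffle and splits the roster into two equal-size condition groups.

```python
import random

def random_split(units, seed=None):
    """Shuffle a roster and split it into two equal-size condition groups."""
    rng = random.Random(seed)  # seeded for a reproducible assignment
    roster = list(units)
    rng.shuffle(roster)
    half = len(roster) // 2
    return roster[:half], roster[half:]

# illustrative only: 54 consenting students split into two groups of 27
ps_group, pb_group = random_split(range(54), seed=2024)
```

A fixed seed lets the assignment be documented and audited; with no seed, the split is freshly randomized on each run.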
Pretests and Posttests
Because of the particular aims of the study, it was necessary to develop the test that was to be given as a pretest and posttest. The tests were constructed using the concept of content validity. Content validity refers to the extent to which a measurement instrument is a representative sample of the content domain being measured (Robson 2002). This form of validity is usually employed when a researcher attempts to assess achievement in a knowledge area (Robson 2002). The researcher used Information Power (AASL and AECT 1998) and other seminal information problem-solving publications such as The Definitive Big6 Workshop Handbook (Eisenberg and Berkowitz 2003) to ensure that the test items were reflective of the content knowledge and skills of the information problem-solving domain. After designing and developing the test, the researcher submitted it to a panel of five information educators to take the test themselves and provide feedback for augmentation and change. The final version of the test was composed of two sections: (1) a section with ten best-answer questions that overviewed information problem-solving knowledge and (2) a section that required students to complete an information problem-solving activity (see appendix A). The problem-solving activity was designed to illuminate students’ abilities in solving a problem using information actions and processes such as their ability to locate, evaluate, synthesize, and use information. A rating scale—designed by the researcher and augmented by the same panel of information educators—was used to grade both portions of the test.
The Testing Situation
The students arrived at the computer lab and were instructed to turn off all computers. The tests were then distributed, and students were asked to print their names on the first page, which was the best-answer portion of the test. The researcher gave an overview of the two sections of the test and provided an initial reading of the directions. Students were then given the entire class period to complete the test and were reminded that they would need their computers for the problem-solving activity section of the test. The students were permitted to ask questions; however, the teacher and researcher could not define domain-specific terms, ideas, procedures, or processes. Furthermore, they could not provide answers or tips.
Grading the Pretest and Posttests
After all tests were completed, the researcher assigned a unique number to each test. This unique number was written in two places: directly beside the student’s name and on the third page of the test. The first page of the test (the best-answer portion) was then graded and removed to ensure that the problem-solving section could be reviewed blind. The researcher also graded the problem-solving activity portion of the tests, and the SLMS and technology educator were asked to review all assigned problem-solving grades for rater reliability purposes. The goal was to achieve 100 percent agreement on grade assignment; therefore all rating discrepancies were discussed and agreement was achieved through rating clarifications or grade modifications. (See appendix B for the scoring guide.)
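The agreement criterion can be made concrete with a small sketch. The function below is a hypothetical helper, not part of the study’s instruments; it simply computes the share of items on which two raters assigned identical scores, the quantity the raters pushed to 100 percent by discussing discrepancies.

```python
def percent_agreement(rater_a, rater_b):
    """Proportion of items on which two raters assigned identical scores."""
    if len(rater_a) != len(rater_b):
        raise ValueError("raters must score the same set of items")
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a)

# discrepant items are discussed and rescored until this reaches 1.0
```

Percent agreement is the simplest interrater statistic; chance-corrected measures (e.g., Cohen’s kappa) are stricter, but the study’s stated goal of full agreement makes the raw proportion the relevant check here.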
An independent-sample t-test on gain scores was used in this study, and three hypotheses were formulated and tested. Overall, knowledge, and application gains from pretest to posttest were calculated, and the mean gains of the two groups were then compared using t-tests. The test statistic—the T score—was calculated using SPSS software. The p-value—the probability of obtaining a result at least as extreme as the observed one if the null hypothesis were true—was determined on the basis of the T score; because the hypotheses were directional, the two-tailed value was halved to obtain the p-value for a one-tailed test. Finally, the p-value was compared to the predetermined .05 significance level.
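The gain-score procedure described above can be sketched in code. The sketch below is a minimal illustration, not the study’s actual analysis (which was run in SPSS): the student scores are hypothetical, and the statistic computed is the pooled-variance (equal-variance) independent-samples t statistic.

```python
import statistics

def gain_scores(pretest, posttest):
    """Per-student gain: posttest score minus pretest score."""
    return [post - pre for pre, post in zip(pretest, posttest)]

def pooled_t(group1, group2):
    """Independent-samples t statistic on two sets of gain scores,
    using the pooled (equal-variance) estimate of variance."""
    n1, n2 = len(group1), len(group2)
    m1, m2 = statistics.mean(group1), statistics.mean(group2)
    v1, v2 = statistics.variance(group1), statistics.variance(group2)
    pooled_var = ((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2)
    return (m1 - m2) / (pooled_var * (1 / n1 + 1 / n2)) ** 0.5

# Hypothetical pretest/posttest percentages for a handful of students.
ps_gains = gain_scores([54, 60, 57, 62], [75, 76, 73, 78])  # simulation group
pb_gains = gain_scores([56, 61, 58, 59], [70, 72, 69, 71])  # problem-based group

t = pooled_t(ps_gains, pb_gains)
print(f"T = {t:.3f}")
# For the directional hypothesis (PS gains > PB gains), halve the
# two-tailed p-value associated with T and compare it to .05.
```

A statistics package (e.g., SciPy’s `ttest_ind`) would normally supply the p-value; the halving step noted in the comment mirrors the one-tailed procedure described in the text.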
Descriptive Statistics: PB Method
The average for the PB pretest score (combined seventh and eighth grade classes) was 58.41 percent, and the PB posttest average (combined seventh and eighth grade classes) was 70.56 percent. The PB seventh grade class averaged 60.82 percent on the pretest and 72.36 percent on the posttest. The seventh grade PB class averaged 36.36 percent on the pretest knowledge-oriented section, and 39.09 percent on the posttest knowledge-oriented section. This group achieved a pretest average of 24.45 percent on the application section and 33.27 percent on the posttest application section.
The PB eighth grade class achieved a pretest average of 56.75 percent and 69.31 percent as a posttest average. Furthermore, the PB eighth grade class averaged 30.38 percent on the pretest knowledge-oriented section, and 40.63 percent on the posttest knowledge-oriented section. This group achieved a pretest average of 26.38 percent on the application section and 28.69 percent on the posttest application section.
Descriptive Statistics: PS Method
The average (combined seventh and eighth grade classes) for the PS pretest score was 58.30 percent, and the PS posttest average was 75.26 percent. Moreover, the PS seventh grade class averaged 58.75 percent on the pretest and 74.92 percent on the posttest. The seventh grade PS class also averaged 36.25 percent on the pretest knowledge-oriented section and 36.67 percent on the posttest knowledge-oriented section. This group achieved a pretest average of 22.5 percent on the application section and 38.25 percent on the posttest application section.
The PS eighth grade class achieved a pretest average of 57.93 percent and 75.53 percent as a posttest average. Furthermore, the PS eighth grade class averaged 37.67 percent on the pretest knowledge-oriented section and 37.33 percent on the posttest knowledge-oriented section. This group achieved a pretest average of 20.8 percent on the application section and 38.2 percent on the posttest application section.
Each of the three hypotheses was tested using an independent-sample t-test comparing student gains from pretest to posttest.
- Hypothesis 1. The first hypothesis stated that the PS learners would have significantly higher overall mean gain scores from pretest to posttest. Hypothesis 1 was supported. There was a significant difference (T = 1.852, p = .04*) between the overall PB and PS gains after the four-week implementation period (see table 1).
- Hypothesis 2. The second hypothesis stated that the PS learners would have significantly higher mean gain scores on the knowledge section from pretest to posttest. Hypothesis 2 was not supported. There was a significant difference, but not in the direction anticipated: the PB learners experienced significantly larger knowledge gains (T = -3.664, p = .00*) after the four-week implementation period. Furthermore, the data suggest that the PS students made no knowledge gains during the four-week period (see table 2).
- Hypothesis 3. The third hypothesis stated that the PS learners would have significantly higher mean gain scores on the application section from pretest to posttest. Hypothesis 3 was supported. There was a significant difference (T = 3.873, p = .00*) between the PB and PS application gains after the implementation period (see table 3).
The pretest results for all classes (58.35 percent) suggest that students did not fully grasp the major areas of information problem-solving before instructional implementation. Students averaged 34.93 percent on the section of the pretest that asked content knowledge questions about information literacy in the context of problem-solving, and their average was lowest (23.57 percent) on the section that required them to apply practices such as problem definition, access, evaluation, analysis, synthesis, and use. The pretest results reflect the K–12 information literacy literature. In particular, they reflect the problems that the literature has illuminated with respect to middle school and high school students’ ability to define a problem (Loerke 1994; Mark and Jacobson 1995), identify types of information needed (Akin 1998; Pitts 1995), locate information (Chen 1993; Irving 1990; Neuman 1995; Solomon 1994; Todd 1998; Nahl and Harada 1996), and evaluate and select information (Fitzgerald 1999; Irving 1985; Hirsh 1998). The results also support recent studies indicating that students struggle in these areas (e.g., accessing, evaluating, analyzing, synthesizing, and applying information) during their first year of college (Dunn 2002; ICAS 2002). At the onset of instructional implementation, students clearly struggled both to answer questions about typical information literacy knowledge in the context of information problem-solving and to understand and apply information problem-solving practices (e.g., access, evaluation, and use).
After the four-week implementation period, an examination of mean gain scores between groups suggested that the PS approach was more effective in expanding students’ overall development—which refers to combined knowledge and application abilities—in the area of information problem-solving. The PS learners experienced significantly larger overall gains than the PB learners from pretest to posttest (T = 1.852, p = .04*), although both instructional methods constituted ideal, high-performing learning systems. However, when we divide overall development into its constitutive parts, knowledge expansion and application expansion, the significant overall difference comes into sharper focus. For instance, the PB students experienced significantly higher gains on the knowledge section of the information literacy test from pretest to posttest (T = -3.664, p = .00*); with respect to knowledge gains, the PS students were outperformed by 7.56 percentage points. In fact, the data indicate that the PS group made no knowledge gains during the four-week learning period; their mean knowledge scores at pretest and posttest were essentially unchanged. This finding provides some insight into potential affordances tied to face-to-face PB approaches, and it is believed to result largely from the types of teaching and learning that are possible within the real-world PB method.
Within the PB instructional approach, the information specialist and technology teacher engaged in a very high level of direct and dynamic information problem-solving instruction (e.g., lecturing, modeling successful performance, and demonstrating important information problem-solving skills). Furthermore, within the PB approach, the educators did not structure a typical lecture environment in which the teacher takes on the role of a performer and the students play the role of an audience. Instead, both educators and students acted as performers within the instructional environment. The educators were very skilled at (1) questioning students in ways that transformed them into cocreators of the overall lecture message, (2) eliciting questions that helped to deconstruct information problem-solving into its constituent parts, (3) eliciting students’ experiences to illustrate specific practices related to the problem-solving process, and (4) engaging students in practical demonstrations to reinforce content. This combination of dynamic lecturing, modeling, and demonstrating was very effective in teaching students about the components of the information problem-solving process and general IL knowledge.
The PS students are believed to have made no gains in the knowledge expansion area for two primary reasons: the educators’ lack of student engagement within the simulation space and a lack of time for knowledge development. First, the simulation environment was not designed to replace real-world educators. Instead, the simulation was designed to have a continuous (live) educator presence within the virtual environment to engage students in Socratic questioning, reflective thinking, and process modeling. The Cybrarians (preprogrammed computer-generated librarians) within the simulation environment were not designed to serve as primary educators; instead, they were designed to assist the live educators in sharing the range of knowledge and practices required to solve problems within and across contexts. Although the researcher attempted to engender a sense of educator ownership of the simulation environment (by having a period in which the educators augmented the tasks of the environment and by continuously discussing the importance of the live educator’s role within technology-based instruction), the educators seemed to feel that they were in direct competition with a computer program. Indeed, it was very common for the educators to use the phrase “our students” when speaking about the PB group, and their level of student engagement with the PS group deviated from the researcher’s articulated expectations. This lack of engagement required students to rely almost exclusively on the preprogrammed Cybrarians (or librarian-bots) for knowledge and practices. Second, students were expected to learn knowledge of the problem-solving domain over time through interactions and activity—not in a lecture format. PS students may not have had enough time to gain knowledge of the full range of acceptable and typical content within the information problem-solving domain, as evidenced by the fact that they were more likely to incorrectly answer the “all of the above” questions during the posttest.
The PS students achieved significantly higher gains in the area of information problem-solving application from pretest to posttest (T = 3.873, p = .00*), and this finding provides some insight into potential affordances tied to the use of simulation environments in mediating IL instruction. The simulation emphasized the contexts within which information-literate practices occur and the idea of learning within social activity and communities of practice. Moreover, information literacy learning did not occur outside of information contexts and participation in the information practices tied to the information domain. The seemingly infinite, complex, 3D, computer-based environment coupled with a compelling storyline that made students a part of a community of practice (information studies) may have encouraged students to develop information processes and practices for the purposes of understanding, navigating, and using the many information environments and artifacts within the space. It may also have engaged the students in a culture of sharing knowledge, practices, processes, and experiences to fulfill their role within the simulation. The PS group emerged as a process-oriented group: they focused more on developing information problem-solving practices. The PB group—on the other hand—was more product oriented: students were focused more on generating a product to be graded than on developing problem-solving practices. Furthermore, PB students did not seem to treat information problem-solving practices as processes needed to successfully address the scenarios; instead, they treated the practices as something they were forced to do. PB students focused on completing as many scenarios as possible within the class period instead of fully developing their skills and fully addressing the scenarios.
We should not take an either/or approach to IL instruction; instead, we must expand our range of instructional approaches on the basis of effectiveness and affordances. The PB method coupled with a high level of direct and dynamic instruction seems to be very effective in presenting content knowledge, and the PS method coupled with scaffolding, communities of practice, and process frameworks seems to be very effective in teaching the application of IL practices. Both approaches should be used to develop the type of information-literate students that our field desires. However, we must attempt (via research studies, learning theory, and instructional systems technology) to decrease the affordance gap between the two approaches. For example, the findings of this study suggest that learning within the two approaches is distinct and that the two approaches have the potential to construct distinct types of information-literate learners (i.e., learners skilled in content knowledge and learners skilled in the application of information practices). Research studies that attempt to illuminate dynamic activity patterns within IL learning approaches are needed to give greater insight into student learning and instructional affordances and to generate design recommendations that could decrease the distance between affordances tied to particular instructional approaches.
Works Cited
Ainge, D. 1996. Upper primary students constructing and exploring three dimensional shapes: A comparison of virtual reality with card nets. Journal of Educational Computing Research 14: 345–69.
Akin, L. 1998. Information overload and children: A survey of Texas elementary school students. School Library Media Quarterly. www.ala.org/ala/mgrps/divs/aasl/aaslpubsandjournals/slmrb/slmrcontents/volume11998slmqo/akin.cfm. Accessed Dec. 4, 2008.
Akpan, J. and T. Andre. 2000. Using a computer simulation before dissection to help students learn anatomy. Journal of Computers in Mathematics and Science Teaching 19: 297–313.
Aldrich, C. 2005. Learning by doing: A comprehensive guide to simulations, computer games, and pedagogy in e-learning and other educational experiences. New York: Wiley.
American Association of School Librarians (AASL) and Association for Educational Communications and Technology (AECT). 1998. Information power: Building partnerships for learning. Chicago: ALA.
American Library Association (ALA). 1989. Presidential committee on information literacy, final report. Chicago: ALA. www.ala.org/ala/mgrps/divs/acrl/publications/whitepapers/presidential.cfm. Accessed Dec. 4, 2008.
Bingham, J. 1994. A comparative study of curriculum integrated and traditional school library media programs: Achievement outcomes of sixth-grade student research papers. EdD dissertation, Kansas State University, Manhattan.
Blumenfeld, P. et al. 1991. Motivating project-based learning: Sustaining the doing, supporting the learning. Educational Psychologist 26: 369–98.
Bransford, J. and B. Stein. 1993. The IDEAL problem solver. New York: Freeman.
Breivik, P., V. Hancock, and J. Senn. 1998. A progress report on information literacy: An update on the American Library Association Presidential Committee on Information Literacy Final Report. Washington, D.C.: ALA. www.ala.org/ala/mgrps/divs/acrl/publications/whitepapers/progressreport.cfm. Accessed Dec. 4, 2008.
Callison, D. 1986. School library media programs and free inquiry learning. School Library Media Quarterly 15: 20–24.
Chen, S. 1993. A study of high school students’ online catalog searching behavior. School Library Media Quarterly 22: 33–40.
Collins, A., J. Brown, and S. Newman. 1989. Cognitive apprenticeship: teaching the crafts of reading, writing and mathematics. In Knowing, learning and instruction: Essays in honor of Robert Glaser, ed. L. B. Resnick, 223–53. Hillsdale, N.J.: Lawrence Erlbaum Associates.
Costabile, M. et al. 2003. Evaluating the educational impact of a tutoring hypermedia for children. Information Technology in Childhood Education Annual, 289–308.
Craver, K. 1989. Critical thinking: Implications from research. School Library Media Quarterly 18: 13–18.
Dempsey, J. et al. 1994. What’s the score in the gaming literature? Journal of Educational Technology Systems 22: 173–83.
Doyle, C. 1994. Information-literate use of telecommunications. DMLEA Journal 17: 17–20.
Dunn, K. 2002. Assessing information literacy skills in the California State University: A progress report. Journal of Academic Librarianship 30: 26–35.
Eisenberg, M., and M. Brown. 1992. Current themes regarding library and information skills instruction: Research supporting and research lacking. School Library Media Quarterly 20: 103–09.
Eisenberg, M., and R. Berkowitz. 1990. Information problem-solving: The Big Six Skills Approach to library and information skills instruction. Norwood, N.J.: Ablex.
———. 2003. The Definitive Big6 Workshop Handbook. Worthington, Ohio: Linworth.
Eisenberg, M., C. Lowe, and K. Spitzer. 2004. Information literacy: Essential skills for the information age. Westport, Conn.: Libraries Unlimited.
Emes, C. 1997. Is Mr. Pac Man eating our children? A review of the impact of video games on children. Canadian Journal of Psychiatry 42: 409–14.
Fitzgerald, M. 1999. Evaluating information: An information literacy challenge. School Library Media Research 2. www.ala.org/ala/mgrps/divs/aasl/aaslpubsandjournals/slmrb/slmrcontents/volume21999/vol2fitzgerald.cfm. Accessed Dec. 4, 2008.
Ford, M. J., and E. A. Forman. 2006. Redefining Disciplinary Learning in Classroom Contexts. Review of Research in Education 30: 1–32.
Gallimore, R. and R. Tharp. 1990. Teaching mind in society: Teaching, schooling and literate discourse. In Vygotsky and education: Instructional implications and applications of sociohistorical psychology, ed. L. C. Moll, 175–205. Cambridge, U.K.: Cambridge Univ. Pr.
Geban, O., P. Askar, and I. Ozkan. 1992. Effects of computer simulations and problem solving approaches on high school students. Journal of Educational Research 86: 5–10.
Gee, J. 2005. What video games have to teach us about learning and literacy. New York: Palgrave Macmillan.
———. 2007. What video games have to teach us about learning and literacy. New York: Palgrave Macmillan.
Gredler, M. 1996. Educational games and simulations: A technology in search of a (research) paradigm. In Handbook of research for educational communications and technology, ed. D. H. Jonassen, 521–39. New York: Macmillan.
Gorriz, C., and C. Medina. 2000. Engaging girls with computers through software games. Communications of the ACM 43: 42–49.
Greenfield, P. et al. 1994. Action video games and informal education: Effects on strategies for dividing visual attention. Journal of Applied Developmental Psychology 15: 105–23.
Hara, K. 1996. A study of information skills instruction in elementary school: Effectiveness and teachers’ attitudes. PhD diss., University of Toronto.
Harris, J. 2001. The effects of computer games on young children—a review of the research. RDS Occasional Paper No. 72. London: Research, Development and Statistics Directorate, Communications Development Unit, Home Office.
Hirsh, S. 1998. Relevance determinations in children’s use of electronic resources: A case study. Proceedings of the 61st ASIS Annual Meeting 35: 63–72.
Intersegmental Committee of the Academic Senates (ICAS). 2002. Academic literacy: A statement of competencies expected of students entering California’s public colleges and universities. www.universityofcalifornia.edu/senate/reports/acadlit.pdf. Accessed Dec. 4, 2008.
Irving, A. 1985. Study and information skills across the curriculum. London: Heinemann.
———. 1990. Wider horizons: Online information services in schools. Library and Information Research Report 80. London: British Library.
Joyce, M., and J. Tallman. 1997. Making the writing and research connection with the I-Search Process. New York: Neal-Schuman.
Kelly, P. 1998. Transfer of learning from a computer simulation as compared to a laboratory activity. Journal of Educational Technology Systems 26: 345–51.
Kuhlthau, C. 1987. An emerging theory of library instruction. School Library Media Quarterly 16: 23–27.
———. 1988. Longitudinal case studies of the information search process of users in libraries. Library and Information Science Research 10: 257–304.
Laffey, J. et al. 2003. Supporting learning and behavior of at-risk young children: Computers in urban education. Journal of Research on Technology in Education 35: 423–40.
Lee, J. 1999. Effectiveness of computer-based instructional simulation: A meta-analysis. International Journal of Instructional Media 26: 71–85.
Loerke, K. 1994. Teaching the library research process in junior high. School Libraries in Canada 14: 23–26.
Malone, T., and M. Lepper. 1987. Making learning fun: A taxonomy of intrinsic motivation for learning. In Aptitude, learning and instruction, volume 3: Conative and affective process analyses, ed. R. E. Snow and M. J. Farr, 223–53. Hillsdale, N.J.: Lawrence Erlbaum Associates.
Malouf, D. 1988. The effect of instructional computer games on continuing student motivation. The Journal of Special Education 21: 27–38.
Mancall, J., S. Aaron, and S. Walker. 1986. Educating students to think: The role of the library media program. School Library Media Quarterly 15: 18–27.
Mark, B., and T. Jacobson. 1995. Teaching anxious students skills for the electronic library. College Teaching 43: 28–31.
Nahl, D., and V. Harada. 1996. Composing boolean search statements: Self-confidence, concept analysis, search logic and errors. School Library Media Quarterly 24: 199–207.
National Center for Postsecondary Improvement (NCPI). 2001. The landscape: A report to stakeholders on the condition and effectiveness of postsecondary education. Change (June): 27–42.
National Commission on Excellence in Education. 1983. A nation at risk: The imperative for educational reform. Washington, D.C.: U.S. Government Printing Office. (ED 226 006)
National Council of Teachers of Mathematics (NCTM). 2000. Principles and standards for school mathematics. Reston, Va.: NCTM.
National Research Council. 1996. National science education standards. Washington, D.C.: National Academy Pr.
Newell, T. 2004. Thinking beyond the disjunctive opposition of information literacy assessment in theory/practice. School Library Media Research 7. www.ala.org/ala/mgrps/divs/aasl/aaslpubsandjournals/slmrb/slmrcontents/volume72004/beyond.cfm. Accessed Dec. 4, 2008.
Neuman, D. 1995. High school students’ use of databases: Results of a national Delphi study. Journal of the American Society for Information Science 46: 284–98.
Pappas, M., and A. Tepe. 1997. Pathways to knowledge: Follett’s information skills model. McHenry, Ill.: Follett Software.
Pitts, J. 1995. Mental models of information: The 1993–1994 AASL/Highsmith research award study. School Library Media Quarterly 23: 177–84.
Prensky, M. 2001. Digital game-based learning. New York: McGraw-Hill.
Price, R. 1990. Computer-aided instruction: A guide for authors. Pacific Grove, Calif.: Brooks/Cole.
Provenzo, E. 1992. The video generation. The American School Board Journal 179: 29–32.
Quarton, B. 2003. Research skills and the new undergraduate. Journal of Instructional Psychology, 30: 120–24.
Randel, J. et al. 1992. The effectiveness of games for educational purposes: A review of recent research. Simulation & Gaming 23: 261–76.
Rieber, L. 1996. Seriously considering play: Designing interactive learning environments based on the blending of microworlds, simulations, and games. Educational Technology Research and Development 44: 43–58.
Rivers, R., and E. Vockell. 1987. Computer simulations to stimulate scientific problem solving. Journal of Research in Science Teaching 24: 403–15.
Robson, C. 2002. Real world research. New York: Blackwell.
Rosas, R. et al. 2003. Beyond Nintendo: Design and assessment of educational video games for first and second grade students. Computers & Education 40: 71–94.
Shaffer, D. et al. 2005. Video Games and the future of learning. Phi Delta Kappan 87: 104–11.
Shaffer, D. 2008. How computer games help children learn. New York: Palgrave Macmillan.
Sheingold, K. 1986. Keeping children’s knowledge alive through inquiry. School Library Media Quarterly 15: 80–85.
Solomon, P. 1994. Children, technology and instruction: A case study of elementary school children using an online public access catalog (OPAC). School Library Media Quarterly 23: 43–51.
Stripling, B., and J. Pitts. 1988. Brainstorms and blueprints: Teaching library research as a thinking process. Englewood, Colo.: Libraries Unlimited.
Thomas, N. 2004. Information literacy and information skills instruction: Applying research to practice in the school library media center. Westport, Conn.: Libraries Unlimited.
Todd, R. 1995. Integrated information skills instruction: Does it make a difference? School Library Media Quarterly 23: 133–39.
Todd, R. 1998. WWW, critical literacies and learning outcomes. Teacher Librarian 26: 16–21.
Van Eck, R., and J. Dempsey. 2002. The effect of competition and contextualized advisement on the transfer of mathematics skills in a computer-based instructional simulation game. Educational Technology Research and Development 50.
Virvou, M., G. Katsionis, and K. Manos. 2005. Combining software games with education: Evaluation of its educational effectiveness. Educational Technology & Society 8: 54–65.
Wenger, E. 1998. Communities of practice: Learning, meaning and identity. New York: Cambridge Univ. Pr.
Wisconsin Education Media Association (WEMA) and the American Association of School Librarians (AASL). 1993. Information literacy: A position paper on information problem solving. ERIC Document ED 376817.
Wiebe, J., and N. Martin. 1994. The impact of a computer-based adventure game on achievement and attitudes in geography. Journal of Computing in Childhood Education 5: 61–71.
Wolf, S., T. Brush, and J. Saye. 2003. Using an information problem-solving model as a metacognitive scaffold for multimedia supported information-based problems. Journal of Research on Technology in Education 35: 321–41.
Woodward, J., D. Carnine, and R. Gersten. 1988. Teaching problem solving through computer simulations. American Educational Research Journal 25: 72–86.