February 2006 Site of the Month

http://clue.library.wisc.edu/

Project Co-directors: Abigail Loomis, Steven Frye, Nikki Busch
Institution: University of Wisconsin-Madison


Interviewer: Terrence Bennett

Description:
CLUE (Campus Library User Education) is a Web-based, multimedia tutorial developed by the campus Library & Information Literacy Instruction Program and funded in part by a DoIT Adaptation Grant. The tutorial was designed to be part of a required information literacy module for the campus Communications Requirement courses, but it can also be used by anyone who wants to learn the basic information-seeking skills and strategies needed to start using the UW-Madison campus library system.

Q. It’s evident that a lot of thought went into creating such a carefully designed and visually engaging tutorial. There was obviously a talented team of people working on this project. Was it a challenge to coordinate the development of the CLUE project with so many people involved? How was the project managed from start to finish?
A. We began work in fall 2004 on what we thought would be a revision of CLUE but what soon became a full redesign, and we moved the “new CLUE” into production at the end of August 2005. For the most part, a core team of three librarians (Steve Frye, Nikki Busch, and Abbie Loomis) was involved in the project from start to finish. Initially this team met weekly, with lots of e-mail exchanges in between; over the summer of 2005 the team met twice a week.

In terms of division of labor, we all worked on drafting outcomes, determining critical content, and developing instructional strategies. Nikki focused on developing rapid prototypes and finalizing them. Nikki and Steve worked out technology issues that came up during the process. Abbie coordinated the administrative aspects of the project: communicating with instruction librarians and faculty, seeking funding, PR, etc.

We also had two instructional design consultants from our campus computing division, who attended weekly meetings as needed. One of the consultants helped us rethink our initial planning process, so that we moved away from the notion of revising the old CLUE toward a plan to redesign the whole tutorial. The other was instrumental in helping us design the container for CLUE and in providing feedback on our weekly iterations of the tutorial; he also helped us with usability testing and with technical questions that came up regarding the software. A graphic artist fine-tuned the design of CLUE’s container.

The fact that much of the planning and work of CLUE was done by a small team of three made managing the project easier. The key was to make sure the three of us kept each other updated on whatever aspect of CLUE we were working on.

Q. One of the noteworthy features of CLUE is the care that was taken to include just enough information—not too much or too little. Some of the content in CLUE—such as a comparison of college-level research with high school-level research, or the description of a card catalog to explain an OPAC—is not often found in other tutorials. How difficult was it for the design team to settle on what to include in CLUE—and what to exclude?
A. Once we realized that we needed to totally redo our learning outcomes and instructional strategies, and had actually done so, it wasn’t difficult to decide what to include in the tutorial. Our outcomes drove the content and became the reference point for any question about whether to include something or not. One of the things we realized in looking at the content of the “old CLUE” was that we covered way too much, so that favorite maxim of instruction librarians—“less is more”—became our guiding mantra throughout the process, and our new outcomes helped us stick to it.

The decision to compare college-level research with high school-level research, and to have real-life campus faculty talk about that difference, came out of a study we did of first-year students several years ago. The students who participated in the study had all done CLUE as the first of a two-part library module for a required communication course. What we learned from the study was that many of our first-year students weren’t buying into the need to move beyond Google for their assignments here at UW. Google had worked for them in high school; why should they need to use anything else—like licensed databases—just because they were now in college? So we decided to address this affective issue directly in Module 1, with a carrot-and-stick approach.

The decision to use the card catalog analogy came from a presentation Steve saw at last year’s LOEX conference, in which the presenter described using this seemingly “old-fashioned” analogy with undergraduates, with surprising success. Students in our usability testing over the summer thought the analogy worked as well.

Q. The tutorial compares search techniques in library resources with search techniques in Google. Was there debate among the design team about using Google as a basis for comparison?
A. No debate, really. Again, because of what we learned from the study described in the previous answer, we realized that for many of our incoming students Google (or other similar search engines) defined their understanding of the world of information and of the search process. We decided that our pedagogical strategy, both in CLUE and in the library session that students attend after completing CLUE, would be to build on what students already knew about searching via Google to introduce new resources and strategies. Rather than ignoring or disparaging the only information tool many students felt comfortable using, we decided to use Google as a touchstone, a point of comparison whenever possible.

Q. One of the modules in CLUE includes some original music—can you describe how this music came to be incorporated into the project?
A. We have Nikki to thank for coming up with the design of Module 2, our orientation module, in which we not only try to orient students to some (not all!) of the resources and services they might want to take advantage of but also try to continue the carrot-and-stick affective approach of Module 1 by making students feel both proud of being accepted to UW and a bit anxious about the size and variety of resources available to them (i.e., “this isn’t your high school library!”). A voice-over with pictures just didn’t seem to carry the emotional message we wanted to convey. Nikki added music by Moby, and that immediately transformed the module. Unfortunately, last month our eight-month effort to get copyright permission to use the piece failed. But we had thought that this might happen and had asked a local musician to compose a piece that would evoke a similar emotional response. We worked with him on this, giving him feedback through several iterations of the composition, and we now have a piece that we think helps achieve our objectives for the module.

Q. Tell us about the technologies that were used to create the tutorial and why you chose them. Were there others that you considered?
A. We really wanted to stick to technologies that would allow even semi-techno-savvy librarians to edit and update the tutorial themselves, rather than rely on technologies that would require outside assistance from IT staff or special training, which was the problematic situation we had found ourselves in with earlier versions of CLUE. After experimenting with a couple of screen-capture software programs that would allow us to create “moving picture” tutorials, we decided to go with Macromedia’s Captivate as our main vehicle.

Specifically, we like that the program allows one to:

  1. Easily switch out background images within a “movie” while retaining mouse movement, etc.
  2. Quickly add and remove audio
  3. Readily manipulate the timing of a clip
  4. Embed quiz questions throughout

Macromedia Dreamweaver and Flash were used to construct the container for the five clips, or Modules. However, they are not strictly necessary for novice tutorial creators or for those who do not need randomly generated quiz versions or multi-module tutorials.

Many of the images that we incorporated throughout the tutorial, such as screen shots, staff photos, and our “Generic Database” search screen in Module 3, were either created or modified for easy viewing via Macromedia Fireworks and/or Adobe Photoshop.

Modules 2, 4, and 5 include links from their final slides out to interactive images of the Libraries’ web site, MadCat, and a journal article record, respectively. These exploratory pages were created using Quiz Image, authoring software created in-house at UW-Madison by a team of academic and IT staff.

Q. What were some of the challenges (technological or other) that you encountered?
A. Because the Captivate software has not been around for very long in its current form (we used version 1.01), we have identified a number of challenges that we hope future versions of the product will address.

Specifically, we would like to have more control over the Quiz Results page which the program generates for a user. We’d like to have room for more text on that page explaining that one must print the page before continuing. We’d also like our users to have the option of saving the results page for later printing.

Also, there were a couple of (what can only be described as) glitches in how Captivate saves its files and generates the .swf files for publishing online. We learned the hard way that it is best to always “save as” rather than simply “save” the .cp files. Unless one continually changes the name of a file (module1-1, module1-2, module1-3, etc.), the result is monstrously large files that are too big for a normal computer to even open. And every once in a while the sound from one slide would magically appear with a completely different slide in the published .swf; one then needs to alter the audio for the slide in question for it to re-save correctly. This is still something of a mystery to us.

Q. Did you learn any lessons from creating CLUE? If you had the opportunity to do this project over again, would you do anything differently?
A. All in all, the process went very smoothly! We were very fortunate to have three co-directors on this project, each with very different learning styles. This not only kept us from getting too bogged down in unnecessary details, but it also resulted in a more well-rounded, more widely appealing tutorial.

In retrospect, we probably would have benefited from addressing the issue of quizzing, and how those quizzes would be “randomly generated,” earlier in the timeline. We assumed that there wouldn’t be any problems doing this and saved it for the end of the project, too close to when we needed to move the tutorial into production for the start of the semester. We did manage to get everything up and working quite smoothly in time for the fall semester classes, but not without some extra stress.

Q. Reflecting on your work creating CLUE, what do you believe is the tutorial’s strongest feature?
A. The strongest feature is probably the first module, which shows students why the topics covered in the rest of CLUE are important to them. As we learned from earlier versions of CLUE, without addressing this ‘affective’ component and getting students to ‘buy in’ to their need to know how to access and effectively use library resources, many students will only half-heartedly apply themselves as they work through the content covered in modules two through five. The decision to devote a whole module to this affective issue was a critical one.

Q. How has the tutorial contributed to or influenced other aspects of your library's instructional services?
A. The process of ‘unpackaging’ and then totally rewriting our learning objectives, and of seeing the tutorial through our students’ eyes, has greatly affected the scripts we’ve developed for the courses we teach and the other tutorials we develop. The lessons we learned about using Captivate have also served us well in developing other tutorials.

Q. Are site usage statistics collected? If yes, how are they used? Have you noticed extensive usage of this tutorial from users not affiliated with the University of Wisconsin?
A. Yes, we do collect and keep usage statistics. These statistics indicate whether students are generally working their way through all five modules or only completing part of the tutorial. We haven’t noticed any statistics that indicate extensive use of CLUE by users not affiliated with UW-Madison.

Q. What are your future plans for CLUE? Do you expect to create additional modules?
A. We expect to conduct focus groups with both students and faculty/instructors and then to use what we learn from them to guide us as we revise the tutorial. We are not planning to create additional modules within CLUE, but we are hoping to revise some of the CLUE programming to make it more elegant. We are also looking at better and more advanced ways of gathering the quiz scores.
