Tips for Program Evaluation Forms

ACRL has a standard evaluation form for Sections to use in evaluating their events. Program organizers can obtain the standard evaluation form from ACRL staff. Program organizers may add questions to the bottom of that evaluation form as desired. Suggestions follow.

General

  • The evaluation form should only ask for information that program organizers plan to use in some way.
  • It is helpful to the IS Executive Committee to have information after the event about both participant demographics and the participants' experience.
  • Participants are more likely to return forms that are brief and easy to fill out.
  • It is helpful to ask someone not familiar with the event to try out the questionnaire in advance.
  • Program organizers should consider providing contact information so that participants can follow up with questions or additional comments.
  • The program chair presents a summary report, including the results of the evaluation, to the IS Executive Committee. This report should be shared with the program’s speakers as well.

Demographics
Sample questions:

  • Type of library: college, community college, public, school, special, university, other (specify)
  • Areas of responsibility (check all that apply): administration, collection development, instruction, reference, systems/technology, technical services, web services, other (specify)
  • Memberships (check all that apply): ALA, ACRL, IS, LIRT, RUSA, other (specify)
  • Years of library instruction experience: 0-5, 6-10, 11-15, 16-20, 21 or more

Participant Experience
Sample questions:

  • What was the most important thing you learned from the program today?
  • How well did this program meet your expectations?
  • If an aspect of the program was confusing or unclear, please describe.
  • Was there anything about the presentation that you would change? (content, length, format, audio, visual, etc.) If so, please describe.
  • Please suggest ideas for future speakers or topics.

Writing Your Questions

Consider using a combination of open-ended, multiple choice, and Likert scale questions. Questions answered using a Likert scale provide quantitative measures. Open-ended questions elicit more information than questions that can be answered with a simple yes/no. Be sure to weigh the amount of information acquired against the effort it takes to analyze it when choosing question types.

Likert scale questions commonly use five-point or four-point scales. There are arguments for both (Response Scales: How Many Points and What Labels? n.d.), but it is best to stay consistent with the scales used in ACRL's standard evaluation form.

Likert scale examples:

How would you rate the quality of the overall program?

Very High (5)    High (4)    Average (3)    Low (2)    Very Low (1)

The program was focused on a timely topic.

Strongly Agree (5)    Agree (4)    Undecided (3)    Disagree (2)    Strongly Disagree (1)

How valuable was the program in helping you meet your goals?

Very valuable (4)    Somewhat valuable (3)    Not very valuable (2)    Not at all valuable (1)
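If responses are collected as numeric codes, the scores can be summarized directly. As a minimal sketch (the response values below are hypothetical, not drawn from any actual evaluation), an average rating and a response distribution could be computed like this:

```python
from collections import Counter

# Hypothetical responses to "How would you rate the quality of the
# overall program?" coded 1 (Very Low) through 5 (Very High).
responses = [5, 4, 4, 3, 5, 2, 4, 5, 3, 4]

average = sum(responses) / len(responses)
distribution = Counter(responses)

print(f"Average rating: {average:.1f}")  # prints "Average rating: 3.9"
for score in range(5, 0, -1):
    print(f"{score}: {distribution[score]} responses")
```

A summary like this, per question, is the kind of quantitative result a program chair might fold into the report to the IS Executive Committee.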

Ask for feedback on only one item or speaker at a time. Instead of asking participants to "rate the suitability of the room for seeing and hearing," ask separate questions: "Please rate how well you could hear the speaker" and "Please rate how well you could see the presentation."

Multiple choice example:
What areas of responsibility do you have in your current job? (Check all that apply)
___ Administration
___ Collection development
___ Instruction
___ Reference
___ Systems/technology
___ Technical services
___ Web services
___ Other (please specify) ____________

  • For multiple choice questions, consider the range of possible answers, and offer an “other” option, if appropriate, with space to enter additional information.
  • If using ranges in the responses, be sure that they do not overlap (0-5 years and 6-10 years, not 0-5 years and 5-10 years).

See Attachment 8 of the Preconference and Conference Program Planning Manual for a sample evaluation form.

Sources

How to develop effective feedback forms. 2000. Total Communication Measurement 2 (3): 3.

Response Scales: How Many Points and What Labels? N.d. Pearson Education, Inc., http://survey.pearsonncs.com/planning/response-scales.htm (accessed July 5, 2006).

Shearer, Kenneth D. and Duncan Smith. 1992. Workshop evaluation: forms follow function. Chicago: Continuing Library Education Network and Exchange (CLENE).

August 29, 2006
