Friday, February 8, 2008

Smallball evaluation of LCCC's Research Skills Activity

I learned at NEFLIN's Feb. 6th "Measuring Your Impact" workshop that I am a smallball information literacy librarian.

"Teams with many home run hitters should play powerball, and teams with players who run fast and are adept at hitting the ball to particular spots are best suited to smallball. The worst thing to do is to attempt power baseball with smallball personnel, or vice versa." (Friedman, 2005)

SMALLBALL STRATEGY: According to the article "'Smallball' Evaluation: A Prescription for Studying Community-Based Information Interventions" (presented to us at NEFLIN's Measuring Your Impact workshop), there are two approaches to evaluating academic outcomes. The "powerball" approach is scientific, often incorporating complex statistical software, time-intensive surveys, and peer-reviewed interpretations. The drawback is that an approach emphasizing final outcomes may overlook the developmental steps that truly address users' needs. "Smallball" approaches, however, "conducted across the life cycle of an information intervention have the potential to tell interested parties what they really need to know, in time, to maximize the chances that a project can be successful. Often, smallball studies are more informative than definitive." (Friedman, p. S46)

SMALLBALL IS GOOD MENTAL PITCHING: This research is timely for me, at this point in LCCC's preparation for accreditation, because the library has successfully refined the Research Skills Activity through just such a smallball approach. Throughout each term, the reference and circulation staff stay alert to student feedback with every worksheet that is turned in. Every student must complete every question correctly before receiving the "Red Sticker" that validates completion of the library orientation. If a student answers incorrectly, staff review the question and demonstrate how to find the correct information. With every question and comment, we confer among ourselves:

  • Are the directions confusing? Should we change the language?

  • Should we add or delete content?

  • How do we highlight directional information that is critical to their navigational skills?

  • Have we chosen the databases that feature instructors' preferences?

  • Can we choose or add another question that better aligns the information literacy skill with the library's and college's mission and long- and short-term goals?

SPEED AND TIMING: Our ongoing evaluations are driven by students' everyday feedback and our speedy interventions. We have revised the Research Skills Activity every term since 2001, often three or four times within a single term. We also customize the activity to highlight appropriate databases for specific disciplines, and we ask students to complete a short evaluation of the activity.

WE DON'T PLAY GORILLA BALL: We have found no compelling reason to tally "incorrect answers" at the end of the term, to compare results of online vs. face-to-face instruction, or to conduct any other definitive, empirical study. Although I have seen larger staffs at other community college libraries implement "powerball" approaches to evaluating their information literacy instruction, that is beyond the scope of a one-librarian operation.

BUNTING OFFENSE: Our flow chart of developing and evaluating the Research Skills Activity & Plagiarism Tutorial is a process of definite steps in which we can respond to immediate needs, quickly re-design and implement the activity, and observe responses of the library user. The process is a self-correcting system that intervenes in time to assist the library user to be successful at the point of need.

INFORMED BY OTHER SMALLBALL PLAYERS: The process is more than casual observation. We conduct focused evaluations that allow immediate feedback to shape the process, and we undertake interventions suited to a "smallball" reference staff. We keep abreast of ACRL's publications and standards to inform our information literacy instructional design. National (ALA, ACRL) and regional (NEFLIN, CCLA) professional development opportunities align LCCC's reference staff with a shared mission to make it possible "for students to better achieve what they have come to do: conduct good research that will facilitate their success in college." (Veldof, 2006) Professional contacts at regional workshops are especially important for a one-person reference department, which has fewer daily opportunities to exchange ideas about instructional design and evaluation.

BASEBALL LOGIC: Yes, "swinging for the fences" may be an exciting strategy, but with smallball personnel (limited library resources), we advance our runners (students) by hitting the ball in particular directions (immediate personal attention) and making sure they arrive safely at the next "base" (level) in their information literacy research skills. Continued collaboration with instructors and students, plus seasonal training, assures our team of effective smallball strategies!

1. Friedman, Charles P. (2005). "Smallball" evaluation: a prescription for studying community-based information interventions. Journal of the Medical Library Association, 93(Suppl.), S43-S48.

2. Veldof, Jerilyn. (2006). Creating the one-shot library workshop: a step-by-step guide. Chicago: ALA.
