The Institute of Museum and Library Services (IMLS) is dedicated to strengthening museum and library services throughout the nation. IMLS awards support key activities of libraries, museums, and related organizations and encourage leadership, innovation, and partnership.
Congress requires IMLS to document results of grant activities each year to meet the requirements of the Government Performance and Results Act of 1993. IMLS uses grantees' reports to show that our grants make a vital contribution to museum and library audiences and their communities. In our publication, Perspectives on Outcome Based Evaluation for Libraries and Museums (PDF, 215K), museum and library experts talk about the important role that evaluation plays for documenting the work that IMLS supports.
What is outcome evaluation?
IMLS defines outcomes as benefits to people: specifically, achievements or changes in skill, knowledge, attitude, behavior, condition, or life status for program participants ("visitors will know what architecture contributes to their environments," "participant literacy will improve"). Any project intended to create these kinds of benefits has outcome goals. Outcome-based evaluation, "OBE," is the measurement of results. It identifies observations that can credibly demonstrate change or desirable conditions ("increased quality of work in the annual science fair," "interest in family history," "ability to use information effectively"). It systematically collects information about these indicators, and uses that information to show the extent to which a program achieved its goals. Outcome measurement differs in some ways from traditional methods of evaluating and reporting the many activities of museums and libraries, but we believe grantees will find that it helps communicate the value and quality of their work to many audiences beyond IMLS.
Why should my organization measure outcomes?
Many resource allocators have turned to OBE to demonstrate good stewardship of their resources. Museums and libraries use such information to validate program expansion, to create new programs, or to support staffing and training needs. For example, the Pennsylvania State Library used data showing that improved student performance was associated with well-staffed school media centers to influence legislation providing additional school librarians.
All libraries and museums strive to provide excellent services, to manage programs effectively, and to make a difference in the lives of their audiences. Any kind of systematic evaluation contributes to project quality. The OBE process supports these goals by focusing programs and providing tools for monitoring progress throughout a project. Evaluation is most effective when it is included in project planning from the very beginning. In OBE, planners clearly articulate their program purpose and check it against target audiences, intended services, expected evidence of change, and the anticipated scale of results. Gathering information during the project can test the evaluation process and can help a grantee confirm progress toward goals. This feedback can also help staff modify work plans or practices if expected results are not occurring.
How does a library or museum do outcome evaluation?
Outcome-based evaluation defines a program as a series of services or activities that lead towards observable, intended changes for participants ("a Born to Read program increases the reading time caretakers spend with children"). Programs usually have a concrete beginning and a distinct end. The loan of a book or an exhibit visit might constitute a program, since these have a beginning and an end, and increased knowledge is often a goal. An individual might complete those programs in the course of a single visit. Outcome measurements may be taken as each individual or group completes a set of services (a workshop series on art history, an after-school history field trip) or at the end of a project as a whole. Information about participants' relevant skill, knowledge, or other characteristic is usually collected at both the program beginning and end, so that changes will be evident. If a program wants to measure longer-term outcomes, of course, information can be collected long after the end of the program.
To use a familiar example, many libraries and museums provide information online. They could count the number of visitors to a web page, based on logs that any web server can maintain. These numbers could indicate how large an audience was reached. Offering a resource, though, only provides opportunity. To know whether online availability had a benefit, an institution needs to measure skills, attitudes, or other relevant characteristics among users and establish what portion of users were affected.
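As a rough illustration of the counting step described above, the sketch below tallies page hits and distinct visitor addresses from a web server access log. The log lines, the page path, and the use of IP address as a stand-in for "visitor" are all illustrative assumptions, not anything prescribed by IMLS.

```python
import re

# Matches the start of a common-format access log line:
# client address, then the request method and path.
LOG_LINE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "GET (\S+) ')

def count_visits(log_lines, page="/health-resources"):
    """Return (total hits, distinct visitor addresses) for one page."""
    hits = 0
    visitors = set()
    for line in log_lines:
        m = LOG_LINE.match(line)
        if m and m.group(2) == page:
            hits += 1
            visitors.add(m.group(1))  # IP address as a rough visitor proxy
    return hits, len(visitors)

# Invented sample log lines for illustration only.
sample = [
    '10.0.0.1 - - [01/May/2002:10:00:00 -0500] "GET /health-resources HTTP/1.0" 200 512',
    '10.0.0.1 - - [01/May/2002:10:05:00 -0500] "GET /health-resources HTTP/1.0" 200 512',
    '10.0.0.2 - - [01/May/2002:11:00:00 -0500] "GET /health-resources HTTP/1.0" 200 512',
]
hits, unique = count_visits(sample)
print(hits, unique)  # 3 hits from 2 distinct addresses
```

Counts like these measure reach, not benefit; the survey techniques discussed next address the benefit side.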
To capture information about these kinds of results, a library or museum could ask online visitors to complete a brief questionnaire. If a goal is to increase visitor knowledge about a particular institution's resources, a survey might ask questions like, "Can you name 5 sources for health information? Rate your knowledge from 1 (can't name any) to 5 (can name 5)." If visitors rate their knowledge at an average of 3 at the beginning of their experience, and 4 or 5 (or 2) at the end, the sponsoring institution could conclude that the web site made a difference in respondents' confidence about this knowledge. It should be clear that such a strategy also lets you test your effectiveness in communicating the intended message!
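The before-and-after comparison above amounts to comparing average scores from the two rounds of the questionnaire. A minimal sketch, using invented sample ratings:

```python
# Compare average self-rated knowledge (1-5 scale) before and after
# the visit. Both score lists are made-up illustrative data.
def mean(scores):
    return sum(scores) / len(scores)

pre_scores = [3, 3, 2, 4, 3]   # ratings collected at the start of the visit
post_scores = [4, 5, 4, 5, 4]  # ratings collected at the end

change = mean(post_scores) - mean(pre_scores)
print(f"pre {mean(pre_scores):.1f}, post {mean(post_scores):.1f}, change {change:+.1f}")
```

A shift in either direction is informative: a rise suggests the site built confidence, while a drop may mean the intended message is not getting through.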
It is rarely necessary to talk to every user or visitor. In many cases, and depending on size of the target audience and the outcome being measured, a voluntary sample of users or visitors can be used to represent the whole with reasonable confidence. Most institutions find that people enjoy and value the opportunity to say what they think or feel about a service or a product.
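A voluntary sample is not a strict random sample, so any precision estimate should be taken as a rough guide only. Still, the standard margin-of-error formula for a simple random sample gives a feel for how sample size relates to confidence; this sketch uses the conservative worst case of a 50/50 split:

```python
import math

# Approximate margin of error at 95% confidence for a simple random
# sample of survey respondents. p = 0.5 is the conservative worst case;
# z = 1.96 is the standard 95% confidence multiplier.
def margin_of_error(sample_size, p=0.5, z=1.96):
    return z * math.sqrt(p * (1 - p) / sample_size)

for n in (50, 100, 400):
    print(n, "respondents ->", round(100 * margin_of_error(n), 1), "% margin")
```

Note the diminishing returns: quadrupling the sample only halves the margin, which is why talking to every visitor is rarely necessary.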
The following projects' goals include changing behavior and skills through project activities.
PROGRAM: COLUMBIA COUNTY READ TOGETHER PROGRAM
Program Purpose: The Columbia County Public Library, Columbia Regional High School, Columbia County Head Start, and Columbia County Literacy Volunteers cooperate to provide story hours, literacy information, materials, and other resources to increase the time parents and other caretakers spend reading to children.
1. Make information visits to neighborhood community centers, County Head Start programs, and Columbia High School parenting classes
2. Provide daily story hours for parents and other caretakers and children at library and other sites
3. Provide library cards
4. Provide literacy counseling
5. Connect learners with literacy tutors
6. Provide children's and basic reader materials to meet individual needs
7. Provide a participant readers' support network
Intended Outcomes: Adults will read to children more often.
Indicators: Number and percent of parents or other caretakers who read to children 5 times/week or more.
Data Source(s): Participant interviews.
Target for Change: At the end of year one, 75% of participating parents and other caretakers will read to children in their care 5 times per week or more.
In the program above, the ultimate goal is to improve literacy in the county, but the project has chosen to measure a more immediate and related goal that provides a short-term indication of progress. That goal is frequent reading to children. Information will be collected through a survey of participants.
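As an illustration only, the indicator and target above reduce to a simple tally of the interview data: what share of caretakers report reading to children five or more times per week? The responses below are invented.

```python
# Invented interview responses: reported read-aloud sessions per week,
# one number per participating parent or caretaker.
responses = [5, 7, 3, 6, 5, 2, 5, 4]

meeting_target = [r for r in responses if r >= 5]  # 5+ times/week
percent = 100 * len(meeting_target) / len(responses)
print(f"{len(meeting_target)} of {len(responses)} caretakers "
      f"({percent:.1f}%) read 5+ times/week")
```

Comparing this percentage against the 75% target for change tells the program, in one number, how close it is to its year-one goal.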
PROGRAM: MUSEUMS ARE FUN FOR EVERYBODY
Program Purpose: The Museum provides a series of workshops about its programs for mothers and 2- to 5-year-olds from the Hills High School parenting program and Cabot Park neighborhood to increase visits by local families and to increase the museum comfort level of mothers who rarely or never visit the museum.
1. Make outreach visits to Hills High School and Head Start parent meetings
2. Provide 3 Saturday workshops for target mothers and children
3. Provide 3 after-school workshops for target mothers and children
Intended Outcomes: Mothers from Cabot Park and Hills High will feel more comfortable bringing kids to the museum and these families will use the museum more.
Indicators: (a) The number of participating mothers who report their comfort in bringing kids to the museum increased to at least 4 on a 5-point scale, and (b) the number of Cabot Park and Hills High visitors in Kids' Week 2002.
Data Source(s): (a) Questionnaire and phone survey for all mothers who participate in a workshop and (b) random exit interviews of adults who visit the Museum with children during Kids' Week 2001, repeated in Kids' Week 2002.
Target for Change: (a) Participants' reported comfort level goes up 75% or more from workshop 1 to 6 weeks after workshop 3 and (b) visits by target families increase from less than 1% in Kids' Week 2001 to 10% in Kids' Week 2002.
In this project, the goal is to increase the museum comfort level of mothers, with the longer-term goal of increasing museum visits by families. The project will ask mothers who participate in the workshops to rate their comfort level in the museum on a simple scale (e.g., "5 = very comfortable, 1 = not at all comfortable") to show that mothers feel more comfortable in the museum after the workshops. It will compare where Kids' Week visitors in 2001 and 2002 live, to see whether the workshop series increased visits from the target neighborhoods. Questionnaires and interviews can be very short, and provide the opportunity to ask other important planning questions. Of course there might be other explanations for a rise in local Kids' Week attendance, but if the museum did not make major changes in the program or publicity, it will be reasonable to think the workshops made a difference. Outcome-based evaluation has different goals from research or many visitor studies: it simply seeks to document the extent to which a program achieved its purposes.
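Both targets in this example come down to small percent calculations. In this sketch, the comfort ratings and visitor counts are invented; only the 75% threshold and the 1%/10% visitor shares come from the targets above.

```python
# Percent change between a before and after measurement.
def pct_change(before, after):
    return 100 * (after - before) / before

# (a) Mean comfort on the 5-point scale: workshop 1 vs. 6 weeks after
# workshop 3. The two means are invented illustrative values.
comfort_before, comfort_after = 2.0, 4.0
print(pct_change(comfort_before, comfort_after))  # 100.0 -> meets the 75% target

# (b) Share of Kids' Week visitors from the target neighborhoods,
# from exit interviews. Counts are invented.
visits_2001 = {"target": 3, "total": 400}
visits_2002 = {"target": 45, "total": 420}
share_2001 = 100 * visits_2001["target"] / visits_2001["total"]
share_2002 = 100 * visits_2002["target"] / visits_2002["total"]
print(round(share_2001, 1), "% ->", round(share_2002, 1), "%")
```

The arithmetic is trivial by design: outcome targets work best when anyone on staff can check progress against them.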
What if my project isn't intended to change skills, knowledge, or other attributes for people?
IMLS supports basic research, organizational enhancements, and other activities intended to strengthen the ability of organizations to provide high-quality services. Such projects may be designed to extend a discipline's knowledge or to create tools to improve practice, rather than to produce immediately observable benefits for end users. IMLS supports such projects because it anticipates that they will contribute to making lives better in the long term. In reporting results of such grants, IMLS wants to know what you believe long-term benefits will be for library or museum users and their communities, and how those improvements will be recognized when they're achieved.
Questions and resources for outcome-based evaluation
If outcome-based and other formal program evaluation is new to your institution, many excellent publications are available to introduce these methods. IMLS offers these pages as a resource, but they are not intended to be limiting or exclusive. Most of these resources draw their examples from educational and social service settings, but many of those examples readily apply to typical goals of library and museum programs. Many of these titles are available at no cost online. While terminology differs from publication to publication, the basic concepts are very similar. All these publications are designed for use by organizations that want to know the results of their programs in terms of human benefits, whether those are called "impacts," "results," or "outcomes." If you have additional questions about outcome-based evaluation for IMLS grant projects, contact:
Matthew Birnbaum, Evaluation Officer
Institute of Museum and Library Services
1800 M Street NW, 9th Floor, Washington, DC, 20036-5802
Phone: 202-653-4760; Fax: 202-653-4600