Community Catalyst

Editor’s Note: This past fall, IMLS launched Community Catalyst, a new initiative aimed at sparking a conversation around ways to help libraries and museums develop a deeper understanding of the best ways to work with communities to bring about positive change. Through a cooperative agreement with Reinvestment Fund, the agency scanned the literature and gathered input from the library, museum, and community revitalization fields. The results of that scan are part of a newly released report (PDF, 28MB). This is the latest in a series of blogs this month highlighting the initiative.

By Dr. Kathryn Matthew
IMLS Director

“Well, I have good news for you. Your blood work is back to normal…that diet and exercise regimen must be working…” And I am thinking, “then why do I feel so awful?” This is precisely the problem we face when trying to figure out whether our best efforts are the right efforts: What do you measure, and what does positive change look like? Beyond that, what is the evidence of change for the better, using the tools and measurements that we do have? While I am not a medical doctor, I do know how I feel, and sometimes that is the opposite of what the test results say.

Those feelings of progress, or lack thereof, were also felt by participants in the September 2016 Community Catalyst Town Hall. An energetic discussion developed over whether a logic model (or “theory of change”) was important to use (analogous to our doctor tracking our health using standard blood tests). A pointed reminder popped up that data collected by others (here again, our doctor and that white lab coat) can often be disempowering or misrepresentative of the individuals and communities we are trying to work with: much like that sense of helplessness that comes when you don’t feel quite right but the results from prescribed lab tests are fine. Overlay all of this with the need to achieve positive performance reviews from grant-making organizations (like IMLS, but also others), and it’s enough to make you sick with stress if you aren’t already.

It is not a matter of not having the tools at hand to diagnose and take precise measurements: we have formative and summative evaluations, surveys, and year-over-year revenue figures to name a few. We are used to reaching for such tools for measuring contained projects like museum exhibits or new extensions to library buildings. It is much harder to think through the right measures for community change when there are hundreds of potential indicators and dozens of datasets of varying quality and relevance.

To give you a feel for the challenges, consider these kinds of pronouncements (and their rebuttals):

"That biotech development will provide more jobs for people in the neighborhood." (Well no, the community doesn't want to change the character of its small bungalows and lose its modest park…plus most of the jobs won't employ them.)

"We can drop off some of our art traveling trunks in the neighborhood schools since the kids can't get bused to the museum." (Yes, but the teachers really want museum staff on site to help with classroom truancy issues and the parents want family art programs on weekends.)

So, what’s up Doc?

Stepping back and considering different ways to “take the temperature” is sometimes necessary when we see symptoms but don’t understand the causes or interactions. How do we evaluate unpredictable or unusual situations and monitor them as conditions change?

Developmental evaluation is one of many dynamic evaluation approaches useful for assessing emerging social change initiatives in complex or uncertain environments. It plays a role similar to that of research and development in private-sector product development because it facilitates close-to-real-time feedback and learning. This helps a team or coalition promptly monitor changes, make rapid adjustments as necessary, and build collective knowledge and experience.

As one Community Catalyst Town Hall participant noted:

"We need to understand what we are evaluating, not just calculate to satisfy the budget office's often far too limited understanding of what we do. Qualitative impact is essential to reaching our potential in addressing the challenges outlined."

In addition to evaluation, there is also value in hearing, and building together with the community, compelling narratives. Lessons learned, perceptions, and understandings shared by community members can then help to better define and drive the next piece of action. This could even involve community-driven data gathering to better understand a newly emerged issue. When driven by the community, such a “shared measures” approach takes away the “white lab coat” outside authority of the evaluator. The data are owned and interpreted by the community, as well as by you. Sometimes it can involve weaving together individuals’ narratives with stories teased out of data.

Using shared measures and more qualitative approaches can stretch you. It may also make your Board or funders uncomfortable. Consider, however, the power of more qualitative indicators like:

  • Amounts and types of participation at community planning meetings
  • Self-motivated follow-ups that occur after the stakeholder meetings
  • How language coalesces around common descriptions of the opportunity or need
  • Formation of new and productive relationships that persist
  • Emergence of a willingness to lend expertise or to influence key stakeholders
  • New leadership stepping forward that engages residents in the project
  • Origin of a community-driven coalition that coordinates, supports, and administers the project
  • Balanced partnership between residents and outside experts—from identifying issues and potential solutions to taking direct actions for community change.

In practice, shared measures means identifying a targeted set of indicators that everyone adopts and applies. It also means using those data to track progress, validate the process to stakeholders, and improve your efforts over time. It may mean compromising on which measures you think are most important to your organization, and instead focusing on data that improve the overall community project. It might also mean being open to changing your measurements along the way to adapt to new information or issues that surface. And it might even mean being open to having some data collected by those who are not trained evaluators.

In summary, community-centric evaluation using shared success measures can nurture the innovation process. It can also improve the likelihood of your community's successes and your role as a catalyst in promoting well-being.


Dr. Kathryn K. Matthew

About the Author
Dr. Kathryn K. Matthew was confirmed by the Senate in September 2015 as the 5th director of the Institute of Museum and Library Services. You can send her comments on the Community Catalyst Initiative by emailing
