Museums, Data, and Stories: The Journey of Adopting an Evaluation Mindset

By Paula Gangopadhyay

My introduction to the power of data and evaluation happened serendipitously, near the beginning of my career, when I was leading a large school reform initiative in the early 2000s.

At the time, the public schools within the city were not performing well on the state standardized tests, and the city’s biggest employer was gravely concerned about the rate of attrition and the future of its workforce. Substantial amounts of research and evaluation reports were being generated, but no one could figure out precisely where and how to help the district.

I had no formal background in evaluation and had never delved into anything of this complexity or scale, but I was charged with the responsibility to figure it out. We needed a framework to organize the wealth of data and point to the gaps we needed to address. The resulting data report showed how each school was performing in comparison to its local and state peers, and where the opportunities for improvement lay, from the micro to the macro level.

This work went on to become a model for others and helped the district address issues in a focused manner, which resulted in significant improvements. Eventually, this hands-on experience opened a new career pathway: I went on to work as an independent evaluation consultant alongside my day jobs, helping non-profits make data meaningful and actionable.

My journey to becoming a “practicing evaluator” wasn’t straightforward, but it taught me valuable approaches that I employed throughout my museum career and still use at IMLS today. Here are a few highlights:

  • Don’t fear the data. Evaluation does not have to be a scary phenomenon, and it does not mean exposing your programmatic or organizational weaknesses to the world.

  • Value what matters. Evaluation is not just about “outputs” data, which most organizations tend to focus on. It’s about looking beyond, at outcomes that can help you chart out your trajectory of improvement.

  • Everyone needs to be on board. Evaluation has to be adopted and practiced as a mindset throughout the entire organization, rather than being the sole responsibility of one department.

  • Don’t stop at stats. Evaluation must be designed to generate both qualitative and quantitative data—stories and stats. That’s what the stakeholders such as funders, policymakers, businesses, and boards find most compelling.

  • Start small but think big. Evaluation is not about conquering the world with one master stroke. Evaluation can start small and grow from learnings, insights, and indicators of progress, as well as failure. It’s okay—and usually better—for the evaluation plan to be tackled in phases over multiple years.

So, how are we adopting and practicing this type of evaluation and data mindset at IMLS? First and foremost, we consider ourselves a learning organization, rooted in the idea that learning is a continuum. We have taken up iterative efforts that may be a “start all” but not necessarily an “end all” solution. Our leadership supports exploring new ideas, and we work together to chart out pilot approaches with benchmarks.

With this holistic learning strategy in mind, IMLS and the Office of Museum Services have taken a multi-pronged approach in recent years:

  • Make the implicit explicit: We called out data and evaluation as specific categories in Museums Empowered, as well as in the National Leadership Grants. In the last few years, we have seen a significant rise in applications specifically focused on evaluation and data studies and have funded many valuable projects.

  • Incorporate objective evaluation in special projects: We incorporated third-party evaluation in all our new special initiatives and cooperative agreements, such as the Museums for All access initiative, the STEM and 21st Century Learning Centers afterschool project, the Museums for Digital Learning platform and resources, the Early Learning Network for museums and libraries, and other initiatives. An iterative feedback approach allows continuous adjustments to project design as needed.

  • Conduct retrospective evaluation, data and market analyses: In FY 2019, we embarked on three major evaluation and data analysis initiatives that will conclude by the end of this year and yield important findings and insights for IMLS and the museum sector.

    • African American History and Culture Grant program evaluation: In partnership with the Urban Institute, this fourteen-year retrospective evaluation study is focused on understanding the universe of African American museums and HBCUs. The forthcoming report will shed light on who has and has not applied for and received IMLS-AAHC grants, the types of capacity-building support the grants have provided for institutional growth trajectories, how the program has performed, and where we can improve. The evaluation incorporated valuable input from external stakeholders, including awardees, non-applicants, other funders, and African American sector partners.

    • Internal grants data analysis and synthesis: A five-year (2014-2018) retrospective analysis of over 1,200 awarded grants from the Office of Museum Services is being conducted in partnership with 2M Research. This study will yield insights into a set of key research questions and produce a variety of quantitative data points and illustrative project examples that will inform IMLS of opportunities and gaps in six portfolio areas: Learning, Collections, Digital, Museum Professional Development, Community, and Diversity and Inclusion. This is the first time IMLS has conducted an extensive internal analysis and synthesis of multi-year awarded grants data and juxtaposed it with a landscape analysis to assess how our grant making is aligned with the evolving needs of the sector.

    • Market Analysis and Opportunity Assessment of Capacity Building for Small and Mid-sized Museums: Together with the Partnership for Public Good and a steering committee of leading museum practitioners and subject matter experts, this study conducted a sample museum sector-wide survey, focus groups, and interviews with stakeholders and funders. This helped IMLS learn what capacity-building programs are currently available for small and mid-sized museums and explore further efforts that can help the sector. The study also looked at other cross-sector capacity-building examples and other funding initiatives for transferable best practices.

These are just a few ways we are adopting and promoting a data- and evaluation-driven mindset within IMLS and in the museum sector. If we want to elevate, nurture, and sustain the value of museums in improving the socio-economic quality of life and well-being, we must all do—and learn—more.

As for what’s next: IMLS is prioritizing efforts to design and develop a museum-wide survey similar to the Public Libraries Survey and to explore other efforts that can help us gather and share better data stories. The journey continues.

Is your organization embracing the evaluation and data-driven mindset? Does the museum sector need more training to embark on and sustain such efforts? What are some areas of data gaps in the museum sector where a collaborative effort might help us all? Let’s talk, listen, and learn, and together, let’s work on some tangible steps to address the data and evaluation issues that affect us all.

Paula Gangopadhyay is Deputy Director of Museum Services at IMLS.
