Drawing Lessons from the Studio to School Initiative

About the Evaluation

Through the Studio to School Initiative, we wanted to explore what was possible if dedicated, collaborative teams of arts education champions were given funding and support to build and improve arts education in community-responsive ways. What would they build? How would they build it? How would arts education change in their school communities? And what could we all learn along the way?

Project teams explore logic modeling at a learning community rendezvous.

The Initiative evaluation was designed to support project teams’ use of data to inform program adaptation and improvement and to build their capacity to engage in ongoing evaluative work. By integrating the evaluation into the learning community and incorporating the evaluation team into the Initiative team, we made the evaluation part of the Initiative itself. We hoped the information gathered would be useful to OCF and the broader arts education community while also contributing meaningfully to the knowledge base about arts education.

  • The evaluation was designed for flexibility and evolution. As an evaluation team, we wanted to remain adaptive and responsive, and take risks, just as we asked the projects to do. We adopted aspects of a developmental evaluation, which was still a relatively new approach in the evaluation field. Anticipating that our information needs would evolve, we planned out activities no more than a year at a time, always working toward our overarching learning goals. We used various data collection and analysis approaches, trying creative methods that were new to us and to the project teams. Along the way, we shared what we were learning, often informally, through learning community convenings.
  • Like the Initiative, the evaluation was layered. The evaluation team worked with the in-house OCF Initiative team to continually revisit our learning goals, reflect and adapt. The evaluation team also worked with individual projects to support their own development and to learn about each project, team and community, identifying similarities and differences and facilitating learning across projects.
  • The Initiative learning community was an important venue for evaluation activities. The evaluation team engaged with project teams in planning and reflection activities, facilitated discussions to explore findings and check our collective understanding, and embedded data collection into these activities. Participants remarked at various stages that the learning community felt like a natural laboratory, and that the Initiative felt like a grand experiment or research project.
  • Engaging in co-creation was just as valuable as the outcomes. Studio to School was an incubator for ideas. It was an opportunity to think differently about OCF’s approach to funding, grant programs and evaluation; challenge traditional funder-grantee relationships; and understand what high-quality arts programming looks like across the state. Over time, the value of engaging in the learning community, and of our own growth and learning alongside projects, became increasingly clear. We grew more comfortable with variation, adaptation and being in multiple developmental stages at the same time. We carried the spirit of experimentation wherever we could, building in reflective work to facilitate and capture the learning community’s growing expertise, and ensuring that the information was shared. We made plenty of mistakes along the way, and there are certainly things we could have done better or differently. But we greatly benefited from the emphasis on process rather than outcomes.
  • A key element and result of the evaluation was the development of principles for the pursuit of high-quality, equitable, sustainable arts education programming. We began developing principles based on the first year’s evaluation findings, and then asked project teams to reflect and provide feedback on draft principles over a couple of years. This deepened engagement with the principles each year, allowing us to continually refine the principles and develop tools to support their use in reflection and project development.
  • The later stages of the evaluation were informed by the Equitable Evaluation Framework™. As practice partners with the Equitable Evaluation Initiative, the evaluation team began reflecting on and adapting our evaluation work midway through the Initiative to better advance equity. This meant centering the value of local perspectives, elevating participant voice and our collective humanity, and not overly formalizing outcomes or holding information or findings too tightly. This culminated in a cathartic session at the final learning community gathering where the evaluation team celebrated and shared draft findings with the Studio to School community.
  • Evaluation reporting was delayed by the COVID-19 pandemic. The Initiative ended with a final rendezvous at Salishan Coastal Lodge in Lincoln City in August 2019. We knew that data analysis and reporting would take several months and aimed to release our final report in spring 2020. But with the onset of the pandemic, the evaluation team pivoted to support response efforts, and work on Studio to School was effectively paused. Throughout 2020, we revisited our findings periodically to consider how to adjust and contextualize what we were learning. OCF continued to use what we learned through Studio to School to inform efforts to support arts education through advocacy and development of a network of arts education champions. In turn, these efforts informed our thinking about Studio to School.

While we recognize the pandemic is not yet behind us, we no longer want to hold back what we’ve learned. We hope that in summer or fall 2021, educators will be in a position to use what we are sharing through this report.

More about our approach

Dance break at a learning community rendezvous.

The Studio to School Initiative and its evaluation were rooted in creativity, adaptation and risk. Through the evaluation, we worked to advance learning and the values of collaboration, humanity and humility. The process was ever-changing, echoing the experimentation, creativity and growth mindset of the 18 Studio to School projects.

We wanted the evaluation to support project development and improvement, and to be approachable and widely useful. Most of all, we wanted it to be a positive and validating process of co-learning, rather than a punitive or overly laborious process. As with any process of experimentation, sometimes we were successful and sometimes we were not. Along the way, we made some tradeoffs and had some difficult conversations about what we could or could not say or prove through the evaluation.

The underlying approach was developmental evaluation, which supports adaptive learning in complex environments. Traditional evaluation works best in situations where there is a clear path from problem to solution: a plan is created, followed and evaluated. By contrast, developmental evaluation is used for social innovations and when working on problems with uncertain solutions. The developmental approach is flexible; plans are expected to evolve as the work progresses (Patton, 2011; Patton et al., 2016).

The evaluation team developed rigorous activities to collect and provide useful information in an ongoing feedback loop. We focused primarily on gathering qualitative information through site visits, interviews and project team journals. We also encouraged project teams to collect data that met their own information needs. Robust reflection was built into the process. For example, debrief meetings were held after each major learning community and evaluation activity to share and document what we learned. These meetings often turned into planning sessions for future evaluation or learning community efforts.

Project teams share with one another at a learning community rendezvous.

Principles-focused evaluation was just emerging as a distinct evaluation approach in the early years of the Initiative. After learning about it through Michael Quinn Patton’s work, we identified it as a good fit for the Initiative. A set of shared principles would unite the projects in developing a shared understanding of what the pursuit of high-quality, sustainable, equitable arts education should look like. As we realized its usefulness, the development of the Studio to School principles became increasingly central to the evaluation.

We began drafting principles based on the first year’s evaluation findings and asked project teams to reflect and provide feedback on several rounds of draft principles. We were thrilled when Patton’s text Principles-Focused Evaluation was published in 2018, as it echoed much of what we’d been doing and laid out a pathway for continuing to build and use the principles.

Find out how the principles were developed and how we worked with them during the Initiative in Studio to School principles.

Finding our way

One of the most interesting, challenging and exhilarating aspects of developmental evaluation is the sense of way-finding that happens during the process. In contrast to other types of evaluation—where theories of change, logic models, evaluation questions, outcomes and data collection activities are often established at the outset—we took the work only a few steps at a time whenever possible, anticipating that our needs would change over time.

Teams discuss project sustainability at a learning community rendezvous.

We began the evaluation with a survey of project teams to identify learning priorities and likely needs, using the survey results to share and refine an initial learning framework through an early learning community rendezvous session. We developed an initial logic model and theory of change for the Initiative broadly, and adjusted them periodically as we refined our goals and clarified our assumptions and priorities. We encouraged projects to work on refining their evaluation plans in a similar way to reflect their evolving information needs.

Our evaluation questions evolved as well. We revisited the questions at several key points during the Initiative; the final set was developed in summer 2019 before the conclusion of the Initiative. Ultimately, we focused on four big questions:

  1. What impact did the Studio to School Initiative and projects have in and across participating communities?
  2. What shared principles for the pursuit of equitable, high-quality, sustainable arts education were identified across the projects?
  3. What did the projects do? How did they look and evolve during the Initiative, and why?
  4. What did OCF learn through Studio to School that could inform future efforts at the Foundation, in Oregon, or by funders and networks elsewhere?

Woodlawn and Portland Children's Museum team members look at photos of student learning and artwork they brought to share at a learning community rendezvous.

Answering these questions and their many detailed sub-questions required us to work at the project, cross-project and Initiative levels.

At the individual project level, we asked questions such as:

  • What arts education programs are the projects developing, implementing and sustaining?
  • What are the outcomes for stakeholders, schools and communities?

Across projects, we asked questions such as:

  • What lessons can we draw from project successes and challenges?
  • What principles are evident across projects?

At the Initiative level, we asked questions such as:

  • How successful was the Initiative in meeting its goals?
  • What has OCF learned about philanthropy’s role in supporting arts education?

Along the way, there were many activities that worked as we hoped, a few that didn’t, and some that impressed us as being more meaningful or useful than we initially could have imagined. The next section recaps our major data collection and analysis activities, including those that weren’t as useful as we anticipated.

Project impacts generated at a learning community rendezvous.

Data Collection and Analysis

We employed various data collection methods over the course of the Initiative, centering and supporting the needs of the projects themselves whenever possible. Many of the activities below overlap, and some were developed in response to what we learned through another activity. While we didn’t end up including every bit of data in our final analysis and reporting, every activity influenced our understanding of the projects and the Initiative.

  • Theory of change and logic modeling
  • Project evaluation support
  • Initiative team reflections
  • Review of project proposals
  • Learning community input and feedback
  • Topical action research
  • Landscape scans
  • Key stakeholder interviews
  • Literature review
  • E-journals to encourage and capture reflective practice
  • Principles development
  • Principles reflection
  • Rubric development and use
  • Site visits, observations, project team interviews and facilitated discussions
  • Budget and expense review
  • Case studies