Drawing Lessons from the Studio to School Initiative
About the Evaluation
Through the Studio to School Initiative, we wanted to explore what was possible if dedicated, collaborative teams of arts education champions were given funding and support to build and improve arts education in community-responsive ways. What would they build? How would they build it? How would arts education change in their school communities? And what could we all learn along the way?
The Initiative evaluation was designed to support project teams’ use of data to inform program adaptation and improvement and to build their capacity to engage in ongoing evaluative work. By integrating the evaluation into the learning community and incorporating the evaluation team into the Initiative team, we made the evaluation part of the Initiative itself. We hoped the information gathered would be useful to OCF and the broader arts education community while also contributing meaningfully to the knowledge base about arts education.
- The evaluation was designed for flexibility and evolution. As an evaluation team, we wanted to remain adaptive and responsive, and take risks, just as we asked the projects to do. We adopted aspects of developmental evaluation, then still a relatively new approach in the field. Anticipating that our information needs would evolve, we planned activities no more than a year at a time, always working toward our overarching learning goals. We used various data collection and analysis approaches, trying creative methods that were new to us and to the project teams. Along the way, we shared what we were learning, often informally, through learning community convenings.
- Like the Initiative, the evaluation was layered. The evaluation team worked with the in-house OCF Initiative team to continually revisit our learning goals, reflect and adapt. The evaluation team also worked with individual projects to support their own development and to learn about each project, team and community, identifying similarities and differences and facilitating learning across projects.
- The Initiative learning community was an important venue for evaluation activities. The evaluation team engaged with project teams in planning and reflection activities, facilitated discussions to explore findings and check our collective understanding, and embedded data collection into these activities. Participants remarked at various stages that the learning community felt like a natural laboratory, and that the Initiative felt like a grand experiment or research project.
- Engaging in co-creation was just as valuable as the outcomes. Studio to School was an incubator for ideas. It was an opportunity to think differently about OCF’s approach to funding, grant programs and evaluation; challenge traditional funder-grantee relationships; and understand what high-quality arts programming looks like across the state. Over time, the value of engaging in the learning community, and of our own growth and learning alongside projects, became increasingly clear. We grew more comfortable with variation, adaptation and being in multiple developmental stages at the same time. We carried the spirit of experimentation wherever we could, building in reflective work to facilitate and capture the learning community’s growing expertise, and ensuring that the information was shared. We made plenty of mistakes along the way, and there are certainly things we could have done better or differently. But we greatly benefited from the emphasis on process rather than outcomes.
- A key element and result of the evaluation was the development of principles for the pursuit of high-quality, equitable, sustainable arts education programming. We began developing principles based on the first year’s evaluation findings, and then asked project teams to reflect and provide feedback on successive drafts over a couple of years. This deepening engagement each year allowed us to continually refine the principles and develop tools to support their use in reflection and project development.
- The later stages of the evaluation were informed by the Equitable Evaluation Framework™. As practice partners with the Equitable Evaluation Initiative, the evaluation team began reflecting on and adapting our evaluation work midway through the Initiative to better advance equity. This meant centering the value of local perspectives, elevating participant voice and our collective humanity, and not overly formalizing outcomes or holding information or findings too tightly. This culminated in a cathartic session at the final learning community gathering where the evaluation team celebrated and shared draft findings with the Studio to School community.
- Evaluation reporting was delayed by the COVID-19 pandemic. The Initiative ended with a final rendezvous at Salishan Coastal Lodge in Lincoln City in August 2019. We knew that data analysis and reporting would take several months and aimed to release our final report in spring 2020. But with the onset of the pandemic, the evaluation team pivoted to support response efforts, and work on Studio to School was effectively paused. Throughout 2020, we revisited our findings periodically to consider how to adjust and contextualize what we were learning. OCF continued to use what we learned through Studio to School to inform efforts to support arts education through advocacy and development of a network of arts education champions. In turn, these efforts informed our thinking about Studio to School.
While we recognize the pandemic is not yet behind us, we no longer want to hold back what we’ve learned. We hope that in summer or fall 2021, educators will be in a position to use what we are sharing through this report.
More about our approach
The Studio to School Initiative and its evaluation were rooted in creativity, adaptation and risk. Through the evaluation, we worked to advance learning and the values of collaboration, humanity and humility. The process was ever-changing, echoing the experimentation, creativity and growth mindset of the 18 Studio to School projects.
We wanted the evaluation to support project development and improvement, and to be approachable and widely useful. Most of all, we wanted it to be a positive and validating process of co-learning, rather than a punitive or overly laborious process. As with any process of experimentation, sometimes we were successful and sometimes we were not. Along the way, we made some tradeoffs and had some difficult conversations about what we could or could not say or prove through the evaluation.
The underlying approach was developmental evaluation, which supports adaptive learning in complex environments. Traditional evaluation works best in situations where there is a clear path from problem to solution: a plan is created, followed and evaluated. By contrast, developmental evaluation is used for social innovations and when working on problems with uncertain solutions. The developmental approach is flexible; plans are expected to evolve as the work progresses (Patton, 2011; Patton et al., 2016).
The evaluation team developed rigorous activities to collect and provide useful information in an ongoing feedback loop. We focused primarily on gathering qualitative information through site visits, interviews and project team journals. We also encouraged project teams to collect data that met their own information needs. Robust reflection was built into the process. For example, debrief meetings were held after each major learning community and evaluation activity to share and document what we learned. These meetings often turned into planning sessions for future evaluation or learning community efforts.
Principles-focused evaluation was just emerging as a distinct evaluation approach in the early years of the Initiative. After learning about it through Michael Quinn Patton’s work, we identified it as a good fit for the Initiative. A set of shared principles would unite the projects in developing a common understanding of what the pursuit of high-quality, sustainable, equitable arts education should look like. As we realized its usefulness, the development of the Studio to School principles became increasingly central to the evaluation.
We began drafting principles based on the first year’s evaluation findings and asked project teams to reflect and provide feedback on several rounds of draft principles. We were thrilled when Patton’s text Principles-Focused Evaluation was published in 2018, as it echoed much of what we’d been doing and laid out a pathway for continuing to build and use the principles.
Find out how the principles were developed and how we worked with them during the Initiative in Studio to School principles.
Finding our way
One of the most interesting, challenging and exhilarating aspects of developmental evaluation is the sense of way-finding that happens during the process. In contrast to other types of evaluation—where theories of change, logic models, evaluation questions, outcomes and data collection activities are often established at the outset—we took the work only a few steps at a time whenever possible, anticipating that our needs would change over time.
We began the evaluation with a survey of project teams to identify learning priorities and likely needs, using the survey results to share and refine an initial learning framework through an early learning community rendezvous session. We developed an initial logic model and theory of change for the Initiative broadly, and adjusted them periodically as we refined our goals and clarified our assumptions and priorities. We encouraged projects to work on refining their evaluation plans in a similar way to reflect their evolving information needs.
Our evaluation questions evolved as well. We revisited the questions at several key points during the Initiative; the final set was developed in summer 2019 before the conclusion of the Initiative. Ultimately, we focused on four big questions:
- What impact did the Studio to School Initiative and projects have in and across participating communities?
- What shared principles for the pursuit of equitable, high-quality, sustainable arts education were identified across the projects?
- What did the projects do? How did they look and evolve during the Initiative, and why?
- What did OCF learn through Studio to School that could inform future efforts at the Foundation, in Oregon, or by funders and networks elsewhere?
Answering these questions and their many detailed sub-questions required us to work at both the project and Initiative levels.
At the individual project level, we asked questions such as:
- What arts education programs are the projects developing, implementing and sustaining?
- What are the outcomes for stakeholders, schools and communities?
Across projects, we asked questions such as:
- What lessons can we draw from project successes and challenges?
- What principles are evident across projects?
At the Initiative level, we asked questions such as:
- How successful was the Initiative in meeting its goals?
- What has OCF learned about philanthropy’s role in supporting arts education?
Along the way, there were many activities that worked as we hoped, a few that didn’t, and some that impressed us as being more meaningful or useful than we initially could have imagined. The next section recaps our major data collection and analysis activities, including those that weren’t as useful as we anticipated.
Data Collection and Analysis
We employed various data collection methods over the course of the Initiative, centering and supporting the needs of the projects themselves whenever possible. Many of the activities below overlap, and some were developed in response to what we learned through another activity. While we didn’t end up including every bit of data in our final analysis and reporting, every activity influenced our understanding of the projects and the Initiative.
At both the Initiative and project levels, the evaluation team supported the development of theories of change and logic models to help the projects and Initiative team clarify and refine the relationship between the activities underway and intended outcomes. Given the adaptive nature of the projects and Initiative, we encouraged teams to treat these as working models open to revisiting and revising.
The evaluation team also supported the project teams in progressively setting goals, and in reflecting on, documenting and sharing their progress. Each year, the evaluation team provided worksheets and templates to support project-specific evaluation work that incorporated the development of theories of change and logic models, rubrics to describe and track progress, and e-journal prompts to reflect on learning. In year 4, projects were asked to report on their key outputs and outcomes; details from those reports were key to our analysis of impacts.
Before and after each learning community rendezvous, and at three key milestones (in the first year, between the third and fourth year, and in the final year), the evaluation team engaged the Initiative team in reflection and participatory data analysis activities to support Initiative development and inform next steps. The reflections were a source of data, as the evaluation team gained insight into the Initiative team’s intentions, worked with them to develop evaluation findings, and strategized about how to share what we were learning with the learning community and others.
The evaluation team reviewed project proposals and shadowed the program staff developing funding recommendations to better understand the projects and the goals and intentions of the Initiative. Project proposals were a valuable source of information as we tracked how and why projects evolved from their original plans.
The evaluation team used surveys sparingly and for two distinct purposes. An initial survey gathered team members’ perspectives on their projects: the initial focus, and the possibilities and challenges they anticipated. This survey was helpful early on as we worked to understand the projects. We shared survey results to develop an initial learning framework, checking our understanding and gathering further input through an early learning community rendezvous session.
Second, we administered surveys at three time points (early, mid and late Initiative) using an adaptation of the Levels of Collaboration Scale (Frey et al., 2006) to capture collaborative relationships within and across project teams. While these proved interesting—identifying differences in perspectives within teams, as well as new relationships forming between teams—we struggled with how to make the results useful.
The collaboration surveys were fraught with data collection challenges, most notably that project team members and even core organizations and schools changed during the Initiative. While we observed that relationships were changing, we weren’t sure if we were measuring true changes in partnership quality, or other ways in which partnerships were changing. Interpreting the data was tricky: What does it mean when perspectives within a team vary? Or when collaboration levels go down rather than up, even if we know that the project is progressing in other ways?
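To give a sense of the wave-over-wave summaries involved, here is a minimal sketch in Python, using an invented data structure and illustrative ratings rather than our actual instrument or rosters. The within-team spread is what surfaced the differences in perspective noted above.

```python
from statistics import mean, stdev

# Invented responses: each team member rates the partnership on the adapted
# Levels of Collaboration scale at three waves (early, mid and late Initiative).
responses = {
    "Project A": {"early": [2, 3, 2], "mid": [3, 4, 2], "late": [4, 4, 3]},
    "Project B": {"early": [4, 4], "mid": [3, 2, 3], "late": [3, 3, 3]},
}

for team, waves in responses.items():
    for wave, ratings in waves.items():
        # Spread within a team flags differing perspectives; stdev needs n > 1.
        spread = stdev(ratings) if len(ratings) > 1 else 0.0
        print(f"{team} {wave}: mean={mean(ratings):.1f}, "
              f"spread={spread:.1f}, n={len(ratings)}")
```

Even a summary this simple illustrates the interpretive puzzles: Project B’s falling mean could reflect a weakening partnership, honest recalibration, or simply different people answering at different waves.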
Ultimately, while we shared this data with project teams and internally, we opted not to include it in our final evaluation reporting, as we didn’t feel it helped us answer our core evaluation questions.
Photo: A team works on logic modeling for their project at a learning community rendezvous.
In addition to collecting data during learning community sessions, the evaluation team gathered input and feedback from participants during or following each activity. We summarized and shared these results with the Initiative team during debrief conversations that incorporated planning for future activities. To capture input, we often used simple worksheets inspired by after-action reviews and the “minute papers” used in educational settings. We typically asked three to five questions specific to the content and goals of each learning community activity. The example questions below are from the August 2018 rendezvous, which focused on program quality and explored how the learning community could strengthen the broader arts education community in Oregon; a minimal tallying sketch follows the list.
- Did your work at this rendezvous change or deepen your thinking on what program quality looks like? (Yes, No, Maybe So)
- If so, how? What changed or deepened?
- What did you learn at this rendezvous that you’re excited to implement once you get home?
- Because of your work at this rendezvous, were you able to develop or deepen relationships with other Studio to School projects? (Yes, No, Maybe So)
- Comments:
- Did the rendezvous help you to see how you contribute to building the arts education community in Oregon? (Yes, No, Maybe So)
- Comments:
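For the closed-ended items, summarizing is a matter of simple tallies. A minimal sketch, with invented responses:

```python
from collections import Counter

# Invented responses to one closed-ended question (Yes / No / Maybe So).
answers = ["Yes", "Yes", "Maybe So", "Yes", "No", "Maybe So", "Yes"]

tally = Counter(answers)
for option in ("Yes", "No", "Maybe So"):
    count = tally.get(option, 0)
    print(f"{option}: {count} ({count / len(answers):.0%})")
```

The open-ended responses, of course, called for qualitative summary rather than counting.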
At the second learning community gathering in fall 2014, we introduced the project teams to the concept of topical action research, intending to help cross-project teams come together to do their own research on relevant topics. We expected this would be of interest to most teams based on the design of the Initiative as a learning community, and because of the learning interests that came up in the initial survey and the first rendezvous.
Project teams grouped into topics of interest (e.g., professional development for teaching artists) and facilitated interesting discussions at that convening and in follow-up conference calls. However, it was soon clear that this was not a sustainable way for the teams to work and learn; they were already too busy trying to get their own projects off the ground to take on additional work. We supported a few groups in gathering knowledge and resources to share over the winter but did not pursue this further as a learning strategy.
The evaluation team conducted three related activities to better understand the arts education landscape. The first was a look at the arts and culture funding available in Oregon, based on data from several funders (including OCF).
The second involved a survey administered to community-based organizations in 2015 to learn about, map and describe the arts education available in Oregon through responding organizations. This resulted in the first Oregon Arts Education Snapshot report, published in January 2016.
A Snapshot of K-12 Arts Education in Oregon, released in June 2019, built on the prior report. We administered a second survey of community-based organizations in 2018, and reviewed data available through the Oregon Department of Education to include some information about access to arts education in school.
During winter 2016, the evaluation team conducted interviews with a diverse group of 13 arts education advocates identified by the OCF Initiative team. Interviewees included art teachers, a former superintendent, a district-level arts specialist, state-level advocates, directors and program managers of arts nonprofits, and arts funder representatives, including OCF leaders.
We designed these interviews to help us understand opportunities and challenges in the state arts education landscape and to explore what "high-quality arts education" means to stakeholders. We also captured interviewee perspectives on the Initiative, its role in the arts education landscape, and what they hoped we would gain or learn. A summary of interview findings was shared internally and infused into evaluation planning and early reports.
The evaluation team conducted two in-depth literature reviews: one on arts education program quality, and one on the benefits of arts education primarily for students. This resulted in two reports: Striving for High-Quality Arts Education and How the Arts Advance Student Learning, both released in 2017. Striving for High-Quality Arts Education includes a draft set of Studio to School principles that was further refined after publication.
Beginning in late 2014, we asked each project team to submit a reflective journal post two to four times per year via a website designed for the purpose. Reflection prompts varied and were developed a few months in advance; usually, one prompt was released at a time, a month or more before its due date. Some examples follow.
February 2015: Think about the work you’ve done so far to collaborate with your project partners and other organizations for your Studio to School project. Reflect on one or more of the following:
- Share an example of how partnership has made a difference in doing this work: Who were you partnering with? How did the partnership help? What did you learn from this experience? What was it that made the partnership fruitful?
- What have you learned about the value and challenges of collaborating with others through the project so far?
- Have you discovered a need for additional partners? For deeper collaboration with existing partners? If so, why? How are you adjusting?
- What lessons would you share with others who want to form similar partnerships to work on arts education?
May 2016: During project evaluation planning this year, teams were asked to define what success would look like with regard to student arts learning. We asked: What knowledge, skills and abilities related to the arts would students gain, and how would teams know that learning happened? As the school year begins to wrap up, we’d like teams to reflect on what they’ve learned about student learning. Your post should answer the following questions:
- How did your team define success with regard to student learning? What evidence did you expect to have per your plan?
- How did you track or measure student learning this year? Share any tools used (e.g., rubrics).
- Have you seen (or perhaps heard) and collected evidence of student learning? Share a summary or examples of evidence of student learning.
- What did you learn about tracking or measuring student learning this year (either about the process or the results)? Did you discover anything unexpected? Was anything particularly challenging?
- How will you use what you learned to inform your efforts next year?
September 2017: We’ve had the future on our minds lately. At the recent rendezvous, in spring interviews, and in your grant renewals, we’ve been talking about what your programs (and the Initiative) will look like in years 5, 6 and beyond. For this reflection, we’re asking you to take your vision for sustainability, and your dreams for your projects and communities, one step further.
Imagine that it is the year 2020. What will arts education look like in your community? What’s the story you want to tell about your project’s impact?
April 2019: We spent time at the February Project Leads meeting talking about the Studio to School learning community and what role this network could play in the future of arts education in Oregon. As follow-up to that conversation, we’d like to capture your reflections about how the learning community has impacted you more broadly.
As you think about your experience with the Studio to School learning community:
- How has the learning community impacted your project, organization, school or community (your choice)? How do you think it might shape your work going forward?
- What has been the most useful aspect for you? Is there a particular topic, event or activity from a previous meeting or rendezvous that stands out to you as especially impactful?
- What has been most challenging for you? What do you wish OCF did differently, or would do differently, when developing learning communities like this in the future?
Project teams could make posts visible to everyone in the learning community, or only to the evaluation team, as desired. The Initiative and evaluation team also posted periodic reflections, either responding to the prompt or summarizing and analyzing the project team posts.
Along with site visits and project team interviews/discussions, these posts were one of the evaluation team’s most valuable information sources. They were filled with rich learning, examples of arts education and stories from each project community. While we know the posts were a lot of work for the project teams, we also heard that they were valuable—encouraging reflection, connections and communication within project teams at key points in the Initiative.
A central element of the evaluation was the development of the Studio to School principles. The evaluation team facilitated a collaborative process in which the learning community identified and explored common elements that contributed to the success of each Studio to School project. Through a participatory process, we then shaped these common elements into seven principles that provide a framework for high-quality arts education.
We engaged over 100 people in identifying, developing and refining the principles over five years, including school administrators, arts organization leaders, arts educators, other arts education stakeholders, the evaluation team and other foundation staff, and members of an evaluation advisory group. Many were longtime arts education champions; others were newer to the world of arts education but also contributed valuable insights and fresh ideas.
To develop and refine the principles, the evaluation team facilitated many activities within the learning community and with other stakeholders. This work began with a brainstorming session at the first gathering of project teams in August 2014. Workshopping sessions in 2016 and 2017 with Studio to School representatives, evaluation advisory group members and other arts education stakeholders helped us refine the principles and develop a rubric to support reflection using the principles. The resulting Studio to School principles and supporting tools (including the rubric) were greatly informed by many project site visits to observe programming and interview staff, reports provided by teams to share what they were learning as they developed and improved their programming, and by our review of research about high-quality arts education.
The evaluation team supported the project teams in using the principles to reflect on their work and plan next steps through the following approaches:
- Ideating and reacting with dot voting and sticky notes: In year 2, one of the first ways project teams engaged with draft principles (generated by the evaluation team based on site visits, project reflections and other year 1 activities) was by using dots to vote for the principles most relevant to their projects, and sharing examples of what the principles look, sound and feel like in their projects via sticky notes. Principle text, dots and sticky notes were displayed on posters to allow for a “gallery walk” that encouraged participants to learn about the principles’ relevance across projects. Though we did this in part to narrow down and refine an early draft of the principles, the examples generated also helped us improve the principles’ language and related materials.
- Individual and team reflection: As year 3 got underway, we asked team members to reflect individually on draft principles, and then to share those reflections within project teams to explore differences of perspective. To this end, the evaluation team developed a simple worksheet and a set of stickers with images that represented the extent to which a principle was “in practice” in a project: 1) this principle is growing and thriving in our project, 2) my team is experiencing challenges with this principle, and 3) my team is not working on this principle right now. As evaluation team members read and described each principle, team members chose the sticker that best reflected their team’s current relationship to it. This helped team members get more familiar with the principles and think about how they were or were not reflected in their projects (due to relevance, ongoing challenges, or variations in priorities/goals). Not surprisingly, many participants got creative, splitting stickers in two or piecing two or more stickers together where they didn’t feel a single category quite fit for a given principle.
The subsequent team discussions were revelatory for some. The prompt “As a team, select a principle that one or more team members identified as challenging. Discuss the challenges, and how this principle might help you move through them.” kicked off big conversations about differences in perspective, complex challenges, and goal-setting that team members reported were very meaningful, if also challenging.
Throughout the Initiative, the evaluation team facilitated conversations with project teams that included reflecting on one or more Studio to School principles. In addition, many project teams initiated their own reflective conversations, often to support completing worksheets or e-journal posts. These conversations also took place during project planning sessions at learning community convenings. Examples of guiding questions include:
- What is going particularly well now, or has gone especially well in the past regarding this principle?
- Where would you like your project to be at the end of the year regarding this principle? What will need to happen to get there?
In years 3 and 4, the evaluation team worked with interested project team members, the evaluation advisory group and other arts education experts and stakeholders to develop a rubric that would articulate what the principles look like in practice. Several workshop sessions were held in Portland, Eugene and Hood River to maximize participation, including a session with E. Jane Davidson (an evaluator with expertise in rubric use). Questions that helped to create and refine the rubric include (a structural sketch follows the list):
- What does it look like when a program is living this principle fully?
- What does it look like for schools that don’t get it? What’s the status quo?
- What are the most important component parts of each principle?
- Where might we be setting the bar too high or too low?
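One way to picture the structure these questions imply: each principle breaks down into component parts, and each component is described at a series of stages, from status quo to fully living the principle. The sketch below is purely illustrative—the stage labels, components and example principle are all hypothetical, not the actual rubric.

```python
from dataclasses import dataclass, field

# Hypothetical stage labels; the actual rubric's stages and wording differed.
STAGES = ["status quo", "emerging", "developing", "thriving"]

@dataclass
class Component:
    """One component part of a principle, described at each stage."""
    name: str
    stage_descriptions: dict  # stage label -> what practice looks like

@dataclass
class PrincipleRubric:
    principle: str
    components: list = field(default_factory=list)

    def assess(self, ratings: dict) -> dict:
        """Record a team's self-assessment: component name -> chosen stage."""
        for component, stage in ratings.items():
            if stage not in STAGES:
                raise ValueError(f"Unknown stage {stage!r} for {component!r}")
        return {"principle": self.principle, "ratings": ratings}

# Different aspects of a program can sit in different stages at once,
# as the piloting prompts below acknowledge.
rubric = PrincipleRubric(
    principle="Community responsiveness (hypothetical)",
    components=[
        Component("Family engagement", {s: "..." for s in STAGES}),
        Component("Cultural relevance", {s: "..." for s in STAGES}),
    ],
)
print(rubric.assess({"Family engagement": "emerging",
                     "Cultural relevance": "developing"}))
```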
Projects piloted the draft rubric, considering how one or more principles looked for their project. Teams submitted responses to reflection questions as well as feedback about the rubric tool and piloting process. Examples of prompts and questions include:
- Think about each component of your chosen principles, and determine which description best fits your program. You may find that some aspects of your program are in one stage, and other aspects are in a different stage.
- What evidence do you have that your program is in a certain stage?
- What evidence would you need to indicate that your program has moved to the next stage?
- What would someone disagreeing with you say? What evidence might they cite to support their opinion?
Questions about the rubric pilot process included:
- What was the process like for your team? When you came together to talk about the rubric, was there agreement among team members? What differences, if any, surfaced through this process?
- Did filling out the rubric help you reflect on your project’s progress? What, if any, new insights about your project did this generate? What, if any, aspects of your project did you identify as in need of improvement or attention through this process?
- What changes to the rubric itself or the process would you recommend?
The evaluation team incorporated that feedback—and other adjustments recommended in conversations with the project teams—into a second iteration of the rubric. In the final year of the Initiative, teams used the rubric to reflect on their projects in regard to one or more principles. Reflective questions included:
- What rubric stage best reflects your project (or arts education programming) regarding this principle as of the beginning of the school year? Why did you choose this stage?
- When you think about the principles your team focused on this year:
  - What went particularly well this year? Why?
  - Were you able to make progress on the principles you selected? If so, how would you describe that progress? If not, what got in the way?
  - Was selecting and reflecting on the principles helpful? If so, how has this shaped your work going forward?
Perhaps our most valuable sources of information were the interviews, facilitated team discussions, and observations that took place through project site visits (and occasionally remotely, when weather or travel constraints required).
The evaluation team visited every project team during spring 2015 (the first full school year), spending time getting to know their communities, seeing programming in action, and talking with project teams about project development to date and project goals.
We conducted similar visits in early 2016, which gave us the opportunity to check the understanding of each project we had gained through other reporting (including e-journals), to again see arts education in action, and to explore how project team members were thinking about arts education quality.
In 2017, to minimize the burden on project teams as they were planning for decreased grant funding, we conducted telephone interviews with project leads. Whenever possible, representatives from both the community-based organization and the school or district participated. Interviews focused on project status updates—including how partnerships were developing and project team plans for the next school year—as well as reflections on arts education equity and sustainability.
In 2018, the evaluation team visited the projects identified as potential case studies with the intention of learning more about the communities in which they were working (as well as seeing more arts education programming in action in order to better describe it in reporting). Our visit plans were tailored to each project; in some, we attended and observed community events, and in others, we interviewed arts educators or other project stakeholders.
In 2019, we visited every project once more, facilitating a reflective conversation with core project teams (and sometimes their invited guests) to explore each project’s journey, surface key milestones and related lessons learned, and reflect on how the Studio to School principles were reflected in their journeys. Ahead of these visits, we pulled together project timelines to help teams remember and reflect on their work, and to check our understanding. We also spent some time looking forward—talking about where they hoped their projects would end up after the Initiative.
Each year, project teams submitted expense reports to the OCF program and evaluation teams. This helped the program team monitor grant spending and the evaluation team understand how teams were deploying resources, and what resources were required for various aspects of the projects. We also hoped to learn more about how projects were working toward sustainability: How were project budgets impacted by the step-down in Initiative funding in year 4? Was funding for coordinator positions or arts educators shifting from project budgets to school budgets over time?
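The questions above suggest the kind of summary involved: spending grouped by year and funding source, watched for shifts. A minimal sketch, with invented records, categories and amounts (pandas assumed):

```python
import pandas as pd

# Invented expense records; the actual reports and categories differed.
expenses = pd.DataFrame([
    {"project": "A", "year": 3, "item": "Coordinator", "source": "grant", "amount": 18000},
    {"project": "A", "year": 4, "item": "Coordinator", "source": "grant", "amount": 6000},
    {"project": "A", "year": 4, "item": "Coordinator", "source": "school", "amount": 12000},
    {"project": "B", "year": 3, "item": "Teaching artists", "source": "grant", "amount": 9000},
    {"project": "B", "year": 4, "item": "Teaching artists", "source": "grant", "amount": 7000},
])

# Who pays for what, year over year? A shift from "grant" to "school" as the
# funding source would be one signal of progress toward sustainability.
summary = (expenses
           .groupby(["project", "year", "source"])["amount"]
           .sum()
           .unstack("source", fill_value=0))
print(summary)
```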
While the evaluation team summarized the expense tracking data in 2015 and 2018 (when project funding decreased) and shared what we were seeing informally, neither the evaluation nor the Initiative team found the information useful enough to warrant building the capacity needed for further analysis or for more formal reporting. Thus, we opted to not work further with this data.
That said, project teams also reflected on their budgets, expenses, and plans for sustainability through e-journal posts at key points. We shared summaries of these reflections with the Initiative team and incorporated them into our broader analysis of key Initiative themes and lessons.
We knew at the outset of the Initiative that it would be useful to incorporate project stories and anticipated that case studies would be part of the evaluation. Mid-Initiative, we identified eight projects as potential case studies, strategically sampling projects for a range of community contexts, approaches, disciplines, and opportunities for learning, with a strengths-based lens. Through the case studies, we wanted to tell stories about what worked well and how arts education looked in different settings.
We began following the journeys of eight projects more intensively, eventually winnowing our group of case studies to five. We asked these projects to complete extra work relating to the principles via the rubric pilot and to host or help coordinate additional evaluation team visits and interviews to better understand their work and development. Each project was compensated with a small grant for both years in which they did extra evaluation-related work.
To complete the case studies, we reviewed everything we had collected about each case study project and identified key themes and lessons to highlight. Our expectations and intentions evolved as the key themes of the broader Initiative evaluation came together, but we hoped that the case studies would illustrate:
- Project-specific successes and lessons learned.
- How the Studio to School principles manifest in projects and support their success.
- How programs are evolving, adapting and responding to their communities.
Most of the data collected for this evaluation was qualitative. Analysis was typically done via thematic coding—often using Dedoose—with a single coder or a team of coders, depending on the data source and the period of collection. (Generally, two or more evaluation team members worked on data collection and analysis at any given time.) Using Dedoose allowed us to periodically revisit, adjust or add coding to data sets. Although the coding tree and descriptors used to organize our data evolved, a final round of coding was conducted in 2019 to ensure that we could draw the information we needed consistently across projects and over time.
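As one illustration of what that consistency enables, the sketch below counts code applications by project and year from a flat export of coded excerpts. The file and column names are hypothetical, not Dedoose’s actual export schema.

```python
import pandas as pd

# Hypothetical flat export: one row per excerpt-code application.
excerpts = pd.read_csv("coded_excerpts.csv")  # columns: project, year, code, excerpt

# How often was each code applied, by project and year? A stable coding tree
# is what makes these counts comparable across projects and over time.
frequencies = (excerpts
               .groupby(["code", "project", "year"])
               .size()
               .unstack(["project", "year"], fill_value=0))
print(frequencies)
```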