Overview

About the Initiative evaluation

The Initiative evaluation aims to help Oregon Community Foundation and The Ford Family Foundation understand and document the Initiative’s impacts, to support internal learning and adaptation along the way, and to share what we learn with out-of-school time providers, other foundations and other stakeholders. All of this supports the ultimate goal of the Initiative: to narrow the opportunity gap.

The evaluation is managed by Oregon Community Foundation research staff in partnership with additional evaluation and out-of-school time experts—including Carrie Furrer at Portland State University’s Center for Improvement of Child and Family Services and Corey Newhouse at Public Profit—as well as an evaluation advisory group comprising out-of-school time and evaluation experts. Throughout the evaluation, these experts and program staff have vetted our plans, participated in sense-making and reflection activities to help us interpret and use our findings, and provided feedback on draft materials.

See the Acknowledgments for the full list of people we want to thank for their contributions.
EXPLORE OUR LIST OF RESOURCES & REFERENCES.

Lane Arts Council

Evaluation design

The evaluation was designed to provide both formative/process and summative/impact findings (i.e., to support Initiative implementation and to assess its effectiveness). We took a utilization-focused approach, prioritizing the collection and sharing of information that would be most useful to the Initiative team and participating programs. Though we did not begin with a developmental approach in mind, we did end up employing some aspects of that approach as well, including more rapid-cycle feedback and participatory sense-making processes.

Beginning in 2018, the evaluation team also benefited from participation in the Equitable Evaluation Initiative, which influenced some of our analysis and dissemination decisions, as we aimed not only to support learning through the evaluation, but also to further equity.

The evaluation was organized around three main questions:

  1. How and how well was the Initiative designed and implemented to meet its goals?
  2. How and how well did participating programs implement high-quality out-of-school-time programming to support success for middle school students of color, students from under-resourced rural communities, and students from low-income families?
  3. How and how much did the Initiative and its participating programs contribute to positive youth, parent, organizational and community outcomes?

Oregon MESA

Data collection and analysis

To answer these questions, the evaluation team engaged in a wide range of data collection activities, including:

  • Literature reviews about out-of-school time, the Initiative’s core components and other relevant topics such as social and emotional learning (2014–2020).
  • Annual interviews with out-of-school time staff from almost all participating programs (2014–2020).
  • Observations of dozens of program sessions (2014–2017).
  • Ten focus groups with parents and caregivers (2015–2016).
  • Two rounds of interviews with out-of-school time stakeholders (including other funders and researchers) to learn more about the out-of-school time field in Oregon and beyond (2015, 2017).
  • Support for eight programs that implemented a photovoice project with several dozen students to explore and document their identity in relation to their community, school and out-of-school time program (2016).
  • Support for participating programs in collecting and submitting student-level data about program participation as well as student surveys about social and emotional learning and program experiences (2014–2017).
  • Use of student participation data from 2014 through 2017 (divided into three entry cohorts based on when students began programming) to track students’ short- and long-term educational outcomes and compare them to those of similar peers. This analysis used data available through the Oregon Department of Education, including standardized test scores in math and English language arts, attendance, discipline referrals and freshman credit attainment (2014–2020).
  • Support for planning and reflection on learning community activities, including collecting participant feedback and conducting discussion sessions modeled on after-action reviews to help staff continuously improve implementation (2014–2020).

All data was collected with the consent of students, their caregivers and programs.

OSU SMILE Program

Throughout the Initiative, the evaluation team shared internal summaries of our analyses of the data we collected, usually emphasizing whatever would most help Initiative staff improve Initiative and learning community implementation. Evaluation team members also supported program capacity-building—particularly in evaluation methodology—through sessions offered at learning community convenings and webinars, as well as through program-specific coaching on student-level data collection.

Copies of the tools and instructions developed for the evaluation are available on request from kleonard@oregoncf.org.

Data sources

  • Interviews and observations
  • Family focus groups
  • Stakeholder interviews
  • Program quality data
  • Student photovoice
  • Student surveys
  • Student participation data
  • Educational data

Carrie Furrer, associate professor at Portland State University, talks about the Initiative evaluation's mixed methods approach.

Evolution of the Initiative evaluation

As the Initiative evolved, so did the evaluation. Though our overarching goals and core evaluation questions stayed the same, two things in particular drove shifts in the evaluation: deepening program quality work, and increased understanding of out-of-school time impacts coupled with a galvanized focus on addressing the opportunity gap.

Hacienda CDC

As work on program quality improvement intensified within the learning community and our understanding of what information was most valuable to programs and Initiative leaders deepened, we shifted evaluation priorities to match. Beginning with the third round of programs, funded in late 2016, we ended student-level data collection in order to focus on supporting program quality work and related capacity-building. We continued to interview staff in person or by phone/video at least annually to capture feedback on the Initiative’s design as well as qualitative input about its impact on programs, staff and students. We also continued to track the students who participated in the first three school years, following them into high school to see what we could learn about their educational outcomes. 

Oregon Community Foundation’s embrace of the opportunity gap frame in 2016 helped articulate the through-line between out-of-school time programs and the achievement gap, validating the Initiative’s focus on providing high-quality opportunities for the youth most likely to experience the opportunity gap. This dovetailed with our evolving thinking about how to frame and measure program impacts. Together, these factors are pushing our work—and encouraging others to push their work—toward measures of student progress that connect more closely to what is happening in programming and toward measures of the conditions that support students (e.g., program quality). This is effectively shifting the burden of demonstrating progress from youth to the adults who control the systems and environments in which we hope students will thrive.

DOWNLOAD A BRIEF, PRINTER-FRIENDLY PDF ON REFRAMING OUT-OF-SCHOOL TIME IMPACTS.

Bend Science Station

We also adapted our methods and our assistance to participating programs based on their needs and interests. For example, in the early stages of the Initiative, foundation and program staff were interested in understanding more about program impacts on social and emotional skill development. While there was a sense that program staff were supporting social and emotional learning and that students were developing these skills as a result, many questions remained about which competencies to focus on and how best to measure them in youth. The evaluation team identified Youth Development Executives of King County’s Youth Engagement, Motivation and Beliefs Survey as a promising tool and worked with participating program leaders to adapt and implement it. As the tool was administered each year, we adjusted our guidance and support to programs, as well as how we reported survey results back to them, to make the process more manageable and the results more useful.