Overview
What We're Learning
Through the Initiative, our understanding of what high-quality out-of-school time programming looks like, and how philanthropy can support it, has both deepened and expanded.
Our appreciation for the learning focus of the Initiative has also grown as we’ve seen how it allows for a responsive, adaptive approach that supports staff, program and field development. And we’ve learned a lot about the value of out-of-school time programs and their role in students’ lives, shifting our thinking about how to frame and measure the impact of these programs.
While much of what we’ve learned aligns with existing research, some aspects of the evaluation and its findings reflect nascent or evolving trends, and some are relatively specific to the Initiative’s design. We hope that sharing what we’ve learned will help other funders, networks, programs and evaluators build and improve their own programs and initiatives.
Read more about what we’ve learned in From Quantity to Quality: Lessons learned from an ongoing statewide initiative.
Explore further considerations for applying what we’ve learned in Moving Forward.
About out-of-school time programs
Programs can position students for success and benefit communities in valuable ways. Still, their role in supporting students is often underappreciated within the broader educational system.
Even as participating students are engaged and motivated toward academic futures and supported by peers and adults in growing their confidence, skills and abilities, systemic barriers remain, including those in the educational system. While our evaluation has demonstrated the value of out-of-school time programs in supporting students and their families, we have also come to appreciate that these programs are just one piece of the broader student experience, and one part of the system of supports that students need in order to thrive. They cannot close the opportunity gap or eliminate educational disparities alone. Addressing structural racism and resource barriers will require systemic changes to education.
Our experiences with the Initiative so far, and evidence gathered through the evaluation, confirm that a wide variety of programs can be effective. Culturally specific and locally grown programs; those run by schools, by community organizations or through organization-school partnerships; and those with varying content foci can all benefit students, and each can incorporate meaningful connections to academic learning. We have also learned how tricky it can be to categorize programs for the purposes of evaluation. Most programs are doing more than one thing in more than one way, making it difficult to apply even broad labels in a useful way.
While we saw positive patterns for some types of programs in our educational data analysis (such as programs focused on social and emotional learning), the evaluation also shows that all programs can strengthen program quality. We also gained insights about student experiences across a wide range of programs through student surveys and photovoice projects. While some specific patterns emerged through the student survey, we also see that students are generally quite positive about their experiences in programming, and that how programs support students’ sense of belonging and confidence makes a difference. Ultimately, having a diverse set of programs is important for meeting student and family needs statewide.
The diversity of Initiative-supported programs also strengthened the learning community and added nuance to our understanding of what effective programming looks like and what “effective” means. How do we define “effective”? Effective for whom? Answers to these questions vary by program in ways that are challenging to capture through an Initiative-wide evaluation.
The themes of improved academic engagement and mindsets, an increased sense of belonging, and broader social and emotional learning resonate across participating programs and across the perspectives of students, staff and families. We are encouraged by our findings that indicate a relationship between program participation and positive student outcomes in general, as well as by detailed findings on culturally specific programs, programs focused on social and emotional learning, and academically intensive programs.
The evaluation has also reinforced the importance of programs with strong academic support, positive adult role models and family engagement. Each of these things can take different forms in specific programs. "Academic support" can mean math enrichment in one program setting, and skateboard-making classes in another. Positive adult role models are often staff but can also be near-peer mentors (e.g., college students) or community members that students interact with through service learning projects. Family engagement can take the form of family-focused events, coursework to help families navigate the educational system or their student’s development more generally, or personal relationship-building between program staff and families. Our evaluation gathered evidence of the value of all these variations and many more, giving us a richer understanding of how participating programs support students and families.
Finally, we have come to recognize that many programs serve students of color who have historically been marginalized and denied opportunities in a public education system built primarily for white, middle-class students. Their cultural worldviews are often not reflected in school, and standardized assessment tools often do not measure their achievements. This elevates the need for culturally specific programs that create supportive spaces for students of color—including Black, Indigenous and Latino/x students—as well as the importance of developing evaluation capacity to build evidence for the efficacy of these programs.
Culturally specific programs tend to focus on cultural strengths and to be deeply connected to their respective communities, with staff and leadership that reflect those communities. Funding culturally specific programs puts communities in control of determining how best to meet student needs in ways that center their values and traditions, and that reflect community experience and expertise. Emerging research on what constitutes high quality in culturally specific out-of-school time programs takes a more ecological approach to considering the needs of the whole child and the connection to family, community and school. An evaluation of two culturally specific out-of-school time programs in Minnesota found that:
While these culturally focused, community-connected, student-centered, and trauma-informed programs help students with homework and support academic development, they are more comprehensive in supporting all dimensions of healthy youth development, including strong cultural identity, which is positively related to academic outcomes (Patton & MartinRogers, 2018, p. ii).
For racial/ethnic cultures that have experienced racism and oppression in the school system, the ability to reframe academic interests and settings is promising. There is also growing evidence of the value of culturally specific approaches within education and of culturally specific organizations more broadly (e.g., Gray, 2018; Curry-Stevens & Muthana, 2016).
Because a wide range of programs, approaches and contexts can be effective, the YPQ defines high-quality out-of-school time programming in a way that is both content- and context-neutral. While leaving room for this variation, the YPQ outlines universal program elements or components that indicate high quality, including strong youth-adult relationships.
This rich, detailed framework—in combination with the support provided for engagement with the program quality improvement process through the learning community—helps programs generate meaningful improvement plans. It has also established a shared way of talking about program quality across the diverse set of Initiative participants; programs of various types and in varied contexts could see what they had in common and more readily consider how they might apply what others were doing or learning in their own programs.
Regardless of content or context, adult-youth interactions and relationships are central to high-quality programs. Staff relationships with youth are particularly important in middle school, because students are becoming more independent and exercising greater control over what activities they participate in. Staff credibility and interaction with youth can greatly impact student enrollment and retention (Little, 2006). Youth who experience positive support from adults during out-of-school time enjoy themselves more and are more engaged in programming (Grossman et al., 2007).
Research from the Student Experience Research Network illustrates that students who feel valued and respected can dedicate more cognitive resources to learning because they don’t have to monitor the environment for clues about whether they fit in or how their culture is perceived (Romero, 2018). CASEL and others are also increasingly recognizing the importance of adult social and emotional learning to effective programming for adolescents (Skoog-Hoffman et al., 2020; Schwartz et al., 2020).
Without adults who are well trained and supported in running programming, many of the conditions for student success wouldn’t exist. Adults also play a central role in program quality improvement itself. While this is something we knew at the outset of the Initiative—hence the focus on positive adult role models—we now have an even deeper appreciation for the importance of program staff. Fortunately, the professional development opportunities offered through the Initiative’s learning community have provided strong moral support and an incentive for staff to stay with programs and continue building their skills and expertise in youth development.
Our people are smarter—they know more now. We didn’t lose anyone.
—Program staff
Find out what else we’ve learned about program staff in Positive Adult Role Models: A Learning Brief.
Most, if not all, Initiative-supported programs have experienced the following challenges:
- Limited availability of funding, especially from funders willing to invest in existing programs.
- High staff and management turnover (common among under-resourced youth development organizations).
- Lack of time for, or access to, adequate professional development for staff.
- Insufficient resources to pay staff a living wage (likely related to the high rates of turnover).
- An ongoing need to develop capacity to conduct or commission their own program evaluations, including identifying and measuring the most relevant outcomes for their communities, data collection/tracking, analyzing and interpreting data, and improvement planning.
- Lack of time, capacity or ability to coordinate with other organizations or schools.
Program leaders and staff continue to raise these issues in our discussions about the challenges they face and how the Initiative can best support them. We have attempted to address a few of these shared challenges, including through learning community workshops. We believe these challenges remain persistent and widespread across programs in part because they relate to broader structural or systemic challenges, far beyond the control or purview of the Initiative or programs themselves.
Despite the persistence of these issues, we have also seen signs of progress. Some programs have strengthened coordination with others through the learning community, and many have built at least some evaluation capacity. As the assess-plan-improve cycle is increasingly embedded in organizational culture, and expectations for high-quality programs are more integrated into program designs, program quality will continue to improve even if some of these challenges seem intractable.
Read about progress on these challenges in the program- and field-level sections of Initiative & Program Impacts.
About supporting out-of-school time program quality improvement
The evaluation has demonstrated that quality improvement is valuable for programs and therefore for the students they serve. Strengthening program quality is an ongoing process even for the strongest programs.
Engaging in the program quality improvement process requires substantial coordination and time. The logistics of self- and external assessment require detailed planning from all involved. We learned early in the Initiative that it was unrealistic to ask programs to simultaneously expand the number of students served and engage in program quality improvement.
Download a brief, printer-friendly synopsis of our findings on program quality.
The assess-plan-improve cycle is rigorous when done well; program staff need a range of training and coaching supports to do program quality improvement in a meaningful way. Still, when asked what they most value about participation in the Initiative, program staff often emphasize the value of program quality improvement.
When well supported and implemented, the YPQ process can shift programming and organizations in ways that outlast staff turnover (Smith et al., 2012). High-quality practices become embedded in the way that organizations run programming. Staff who engage in program quality improvement typically find it a valuable professional experience. Program leaders and staff deepen their knowledge and skills and note that program quality improvement validates the importance of their work, encouraging them to set goals and improve their own practice. Even if they leave one program for another, or for another position in a related field, the experience and knowledge they gained through the YPQ process can continue to benefit students.
The YPQ is best used when several rounds of the assess-plan-improve cycle are possible. For most programs, that means engaging in the process for at least three years of programming so that the YPQ tools and process are well understood and utilized. It is often in the third year that we see program leaders and staff engage in the process confidently and gain traction in implementing lasting improvements.
I don’t think it was until year three that we recognized the value of the external assessments. We absolutely dreaded the external assessments because it felt like we had failed in some areas. In year three everything started to click, and it finally got through that this was a snapshot in time and it was to show us things that we just couldn’t see on our own.
—Program leader
Participating programs are expected to engage fully in the Initiative learning community and program quality improvement process. Program leaders and several staff attend at least one in-person convening and often several in-person or virtual trainings each year. The Initiative team is flexible when necessary but considers whether programs are as engaged as possible when making renewal decisions. To support this engagement, grant funding itself is flexible; it can be used for programming and/or to add the staff capacity needed to participate in the learning community and program quality improvement process. Funding use is determined by program staff. In other words, foundation staff trust programs to invest grant funding in the way that best meets their needs and supports Initiative participation.
It has been incredibly valuable to both the success of the Initiative and individual programs that Initiative funders have not—and will not—use YPQ data to make funding decisions or to judge the value of individual programs. Ensuring that the focus on program quality data is first and foremost for reflection and improvement has been necessary for program staff and leaders to honestly assess their practice without fear of consequences.
This, along with overarching efforts to communicate openly and build trusting relationships with program leaders and staff, allows for more honest conversations about program involvement and needs. Program staff can speak freely about what is working and what isn’t and request the support they need to work through challenges. Program staff note that the combination of high expectations for their participation, trust in their expertise and capacity, the Initiative team’s openness and understanding, and the realistic, low-stakes approach to data collection and use makes for more meaningful engagement and promotes a learning mindset.
While the YPQ is intended to work for almost any group-based program, some parts of the framework and tools are inevitably less relevant for certain programs. This can be especially true of culturally specific programs. For example, in Native American-led programs where respect for elders is a central value, the youth-adult relationship may be conceptualized differently or may not be observable in the ways operationalized by the YPQ. In developing the Oregon version of the YPQ tools, we incorporated aspects of cultural responsiveness, given its importance to the participating programs and to other stakeholders at the time. However, adding a few items to the tool is far from enough to ensure that the framework and tools are truly responsive and relevant across all cultural contexts.
Fortunately, program staff can adapt the YPQ and embed their cultural and community norms as they move through the program quality improvement process. This process also gives program leaders and staff an opportunity to discuss how their practice relates to the tool’s constructs, which can help programs refine their goals and design. We were pleased to find that program leaders and staff from culturally specific organizations found the tool very useful, and to see many embrace significant sections of it, connecting it meaningfully to their settings and intentions.
However, we also recognize that this centers a framework, tool and process developed in largely white, dominant-culture academia rather than validating and uplifting culturally specific knowledge and priorities. At a minimum, we know that some aspects of the tool and its constructs can be improved for culturally specific programs. More investment is needed to understand how best to make the YPQ more culturally agile or specific, and what alternatives might better support culturally specific programs.
Family engagement, positive adult role models and purposeful academic support are incorporated in the YPQ framework and are subjects of peer learning opportunities as part of the Initiative learning community.
The evaluation team’s observations and interviews, along with feedback from participating programs, indicate room for continued growth around each of these components. Program leaders and staff find it especially encouraging and useful to discuss practice with their peers. The core components of the Initiative provide a way to organize or focus future peer learning opportunities.
Family engagement approaches vary from program to program. While these differences exist for good reasons—often as a result of responsive, tailored approaches for specific communities—programs can learn a lot from one another about how to connect with and support families. Culturally specific programs often hold expertise in this area; though their context and even goals may differ from other programs, their examples and advice can often be adapted and applied in different settings.
Although positive adult role models are central to the YPQ, which focuses on staff-youth interaction, participating programs have an ongoing appetite for peer learning and potential collaboration to increase staff supports and professionalization. Purposeful academic support also varies by program model and focus, from supporting students in completing their homework, to subject-specific enrichment, to learning that otherwise complements in-school learning. Some programs coordinate or collaborate closely with schools while others struggle to do so. Sharing practices, successes and challenges across programs will continue to benefit participating programs.
About implementing an out-of-school time initiative
Initiative-supported programs vary in their models, organizational and community contexts, and specific goals. Some programs sit within complex organizational structures, while others are the primary focus of their organization. Some have many staff who play specific roles, while others have fewer core staff members who may play multiple roles. Some have sophisticated data systems, and others manage well with simpler systems.
While engaging a wide range of programs presents challenges in figuring out how best to support them and evaluate their impact, the Initiative’s diversity of programs strengthens the learning community and our understanding of what great programs look like and can accomplish. The wide program range has sharpened our focus on the three core components identified at the beginning of the Initiative—positive adult role models, academic support and family engagement—as unifying components of programming. It also underscored the value of the YPQ's neutral framework and its broad definition of "high quality."
The diversity of participating programs provides valuable opportunities for networking, learning and collaboration between programs that might not otherwise have connected as peers. It also helps those of us on the Initiative team position ourselves as learners alongside the programs—every program is an expert on its own model, approach and community context. Last, program diversity has reinforced for us that culturally specific and locally grown programs—as well as those run by schools or community organizations (or by organization-school partnerships)—can each incorporate meaningful connections to academic learning and that all of these types of programs can benefit students.
Oregon Community Foundation has used learning communities to support several longer-term, proactive education initiatives, finding them to be a particularly unifying strategy when participants work with shared frameworks like the YPQ. While the learning community was a central part of this Initiative from the outset, its actual design has evolved. We shifted from just a couple of interactive activities per year initially, to a fall kickoff and a mid-year, multiday in-person convening attended by representatives of all participating programs and offering a slate of complementary training opportunities.
Program leaders and staff benefit greatly from being part of a learning community with colleagues who are also engaged in program quality work. They value the opportunity to share experiences with each other, to discover and learn about shared challenges and successes, and to use the language of the YPQ to talk about what they do, bridging differences in program focus and context.
From the Initiative team’s perspective, it has been incredibly rewarding to watch participating programs learn from one another and build community. More experienced staff help those newer to the Initiative, programs with similar foci or contexts build camaraderie around shared challenges and opportunities, and all are celebrated for their expertise and hard work. The Initiative team participates as much as is appropriate in learning community activities so we can learn alongside the participating programs. This has also promoted relationship-building and power-sharing.
Download a brief, printer-friendly PDF on developing an out-of-school time learning community.
The Initiative team and partners’ commitment to a learning mindset supports responsiveness and relationship-building at all levels. As a result, the Initiative has evolved in valuable ways (e.g., adjusted grantmaking timelines and increased emphasis on quality improvement), demonstrating the value of designing an Initiative to adaptively support learning.
Program leaders and staff appreciate the relational, learning-centered approach taken by Initiative staff and partners, with many noting that it is unusual in comparison to experiences with other out-of-school time funders. Openly and proactively acknowledging that practitioners are the experts in this field (rather than positioning the funders as experts), and elevating program staff as leaders and resources within the learning community, helps build a strong rapport between the Initiative team and program leaders.
Prioritizing learning—encouraging honest program quality assessment by requiring engagement in the process rather than particular scores, and ensuring that YPQ data is never used to make funding decisions—also helps program staff be open and honest with the Initiative team about their challenges, which in turn helps the team better support programs.
Foundations often lack the capacity or expertise to deliver trainings and facilitate learning communities directly, instead relying on or collaborating with intermediaries and other experts. The need for a strong intermediary organization that could support the goals of the Initiative and programs more generally was clear at the outset of the Initiative. Over time, and with support from the Initiative, the Institute for Youth Success at Education Northwest has become a valuable resource for Oregon’s out-of-school time field.
Out-of-school time researchers are paying greater attention to the value of intermediaries in supporting quality and otherwise strengthening programs and systems. For example, recent research illustrates the power of intermediaries to support improved data use in citywide out-of-school time systems (e.g., Gamse et al., 2019). One of the intended (and realized) benefits of our work with the Institute for Youth Success is a shift in power away from funders and into the out-of-school time field. As the Institute for Youth Success deepens and expands training offerings, it has engaged program leaders and staff through train-the-trainer opportunities, further elevating them as experts and building community in the field across the state in both urban and rural areas.
About the out-of-school time field in Oregon
Out-of-school time is a diffuse field with many stakeholders, including a diverse array of programs operating in varying contexts. Outside of the Initiative, there are limited opportunities for coordination or collaboration between programs. There is no dedicated state funding to promote professional development in the out-of-school time field, incentivize program quality, or support sectorwide collaboration or coordination.
By developing Oregon-specific YPQ tools through a deliberate, inclusive process; supporting programs in using the tool over a sufficient time to learn what works and what funding and training is needed; and making the tool and what we've learned available statewide, the Initiative has paved the way for other networks and funders to better understand and support program quality improvement.
Programs and stakeholders have been pleased to see greater coordination and collaboration between funders. They note that as programs, networks and funders build a shared definition of program quality, expectations rise and align across the state, pushing programs to improve while mitigating the reporting burden.
Out-of-school time programs in Oregon often lack stable, long-term funding, much less funding coupled with supports like the Initiative’s learning community. It is rare to find funding for more than a year or two at a time, and funders are often looking to invest in new initiatives or expanded program capacity rather than in general operating support or in proven, ongoing programs.
Public funding opportunities are particularly limited. Statewide, grants for 21st Century Community Learning Centers are distributed primarily through school districts and educational service districts based on awards determined every five years (the current grant awards are for 2018–2023 and include just 116 of the more than 1,200 public schools in Oregon). That said, this federal funding administered by the Oregon Department of Education does trickle down to some community-based organizations and school-run programs.
The Oregon Youth Development Council has also been a valuable source of funding for some programs, particularly those serving youth disconnected from school or needing significant support to get back on track for graduation, college or career.
Several opportunities are specific to the Portland area—through Multnomah County’s SUN Community Schools and Portland Children’s Levy, for example. However, few—if any—similar efforts exist elsewhere in the state.
Oregon’s Student Success Act, passed in 2019, was expected to generate substantial new resources to fund schools. Some of that funding was to be invested by schools with local community partners, including out-of-school time programs (e.g., to support summer programming). School district Student Success Act implementation planning largely paused during the pandemic, but initial fears that funding would be diverted to fill other education budget shortfalls seem to be easing. While we remain optimistic about the potential impact of the Student Success Act, this illustrates that funding sources for out-of-school time are unpredictable and vulnerable to diversion, especially in times of crisis.
Social and emotional learning is a focus of many Initiative-supported programs. They understand its value and see it as central to their work, whether their programming explicitly focuses on social and emotional learning (or a specific component, such as growth mindset or future orientation), or is an undercurrent or foundation of programming otherwise focused on something else.
We’ve explored social and emotional learning in several ways through the Initiative—including via learning community trainings and discussions, as well as through the evaluation by surveying students—and we’ve identified ways to continue promoting social and emotional learning.
The David P. Weikart Center for Youth Program Quality recently developed a social and emotional learning-focused version of the YPQ tools. Use of these tools is likely to be of the greatest value for programs focused on developing students’ social and emotional learning skills.
Staff themselves are vital to social and emotional learning content delivery and practice; their own capacity for social and emotional learning is foundational to how programs promote these skills and capacities (Schwartz et al., 2020). As we have navigated the pandemic and its consequences, the need for social and emotional learning skills and abilities is all the clearer.
Social and emotional learning is also gaining importance in the national out-of-school time field. Many foundations, networks, programs and researchers are strengthening the knowledge base for promoting social and emotional learning. Goldberg et al. (2018, p. 239) note that “it is important for out-of-school time funders to make the connection between high-quality youth development programs and high-quality environments and practices that foster social and emotional learning."
As we watch social and emotional learning gain attention in Oregon’s educational system, we’ve been thinking a lot about how it can be a bridge between out-of-school time and in-school learning as well as between out-of-school time and workforce development efforts. Each of these is a possible pathway for continued learning and support for social and emotional learning through the Initiative and beyond.
In a way, [social and emotional learning] has become the thread between previously disparate efforts to support children in achieving their best possible future and as the foundation for skill building and academic success (Devaney & Moroney, 2018, p. 253).
Download a brief, printer-friendly PDF on social and emotional learning in out-of-school time.
About evaluating out-of-school time initiatives
The Initiative evaluation was originally designed with a focus on utilization, with both formative and summative components and predetermined expectations about what "impact" would mean and look like. However, we also ended up employing developmental evaluation strategies that support adaptation. We would have benefited from planning to do so from the outset in anticipation that the Initiative would evolve. This is likely to be valuable advice for longer-term, complex initiatives, as initiatives are bound to change over the course of five or more years.
Throughout the Initiative, we have needed to adjust our expectations and evaluation plans to ensure that we were gathering and using information to support improvement for both the Initiative and participating programs.
For example, while we explored program quality data for patterns, we made the explicit choice not to use it to assess degrees of program improvement or change. Doing so would not have been appropriate given how the data was collected and used for improvement, and given that assessment scores tend to trend downward in the first few years as staff gain experience with the process and comfort with scoring honestly and critically. This choice also ensured that the evaluation and the Initiative could support authentic engagement and learning; using program quality data differently would have risked programs focusing on getting “the right” or “a better” score, rather than on reflection and improvement.
In 2017, we also decided not to continue collecting student-level data from participating programs. Our reasons included the substantial burden on program staff of capturing and sharing participation data, the challenges involved in matching those student records with educational system records and then tracking them until high school graduation, and a recognition that our work needed to focus on learning rather than research.
By 2017, we had captured student and participation data for three school years and felt that we had a sufficiently large and diverse data set to complete our desired analysis. We knew that it would be at least 2020 before we would find out whether even the oldest students included in our data were graduating at the same or a greater rate than their peers. We were also learning about what data was most useful to participating programs and realized that it would be far more valuable to focus on program quality data and other tools most immediately useful to program staff.
We also recognized that we didn’t need to capture student-specific data to see the impact of programs on students, or to understand the value and effectiveness of the Initiative. Not only was the existing national research base in support of out-of-school time programming growing, but we had plenty of evidence of its value, and the value of the Initiative, through the perspectives of students, families and program staff—a strong compilation of equally valid qualitative data.
The Initiative evaluation has tested the limits of educational data available through state administrative records. Our analysis has illustrated how challenging it can be to understand the relationship between out-of-school time program participation and the state’s standardized educational outcomes.
Variations in what data are collected, school and district policies and procedures, community resources, pedagogy and instruction, and teacher characteristics are just some of the reasons why the results from analysis of standardized educational data are difficult to interpret. In addition, much of the information available about individual students is from specific, isolated points in time (e.g., eighth-grade test scores). While these data might reflect the impact of learning conditions on students, they are not directly focused on those conditions or on the opportunities students have access to inside or outside of school.
Standardized testing is also widely criticized for being biased in favor of white and higher-income students, calling into question the validity of inferences about student learning based on those metrics (e.g., Au, 2016; Walton & Spencer, 2009). Studies have shown that test scores are driven more by factors like access to high-quality instruction and curricula than by student aptitude (e.g., Darling-Hammond, 2004). Unfortunately, data about the quality of instruction or other conditions for learning in the classroom are not systematically collected and shared, which greatly limits our ability to estimate the contribution of those factors to student achievement, let alone the contribution of out-of-school time programs.
Myriad other factors in students’ high school experiences contribute to whether students attend school or do well on standardized tests, many of which are well beyond the control of out-of-school time programs. In other words, if students served by these programs in middle school struggle similarly to their peers in high school, this can more likely be attributed to ongoing gaps in opportunity and other challenges in high school than to program shortcomings.
The impacts that are most proximal and useful to out-of-school time programs are the ones they have the greatest ability to influence, such as the skills, habits and mindsets they can instill while students are in programming, rather than what happens to students after they leave the program. Specific impacts will and should vary depending on each program’s focus.
The emphasis on standardized, longer-term educational measures is also misaligned with current out-of-school time evaluation practice, which is shifting its framing and measurement of out-of-school time impacts so that expectations align more closely with what is happening during programming and within programs’ locus of control. Focusing on the conditions that support student success (e.g., quality out-of-school time opportunities) and measures that indicate whether students are benefiting more directly from and during programming (e.g., by developing social and emotional learning skills) is far more useful than trying to isolate out-of-school time as a cause of test score shifts or other longer-term academic and workforce outcomes.
As the research base on the impact of out-of-school time programs and quality improvement has grown, we have found it increasingly useful to build programs’ capacity to measure quality, knowing that immediate information about practice is more helpful than delayed educational outcomes data. Although our original concept of student success did not rely entirely on long-term standardized educational outcomes, we have shifted toward systemic and proximal measures, such as program quality and social and emotional learning, to understand the value of out-of-school time programs. This shift is reflected not only in the evaluation itself, but also in the learning community’s emphasis on improving program quality and building evaluation capacity.
Finally, reframing what we mean by “effective” out-of-school time or program “impact,” and resetting our expectations of how to measure it, can be done with the goal of furthering equity. We can look for evidence that is multiculturally valid and elevate the value of culturally specific knowledge. What do we expect student success to look like or mean? How do we know it when we see it? Answering these questions requires that we look beyond quantifiable measures to understand student success more holistically.
We can also attend more thoughtfully to the systems and drivers that shape whether students succeed in school and in out-of-school time programming. This helps us acknowledge the limitations of any one program to influence whether a student is successful and illuminates opportunities to shift or at least inform those structures. What can programs, especially those that are culturally specific, teach schools about how to shift educational structures to better support students?
Many of our findings align with existing research, including studies showing that high-quality out-of-school time benefits students academically and promotes social and emotional learning. While we couldn’t detect some of these specific impacts due to data and design limitations, our results also do not contradict these findings. In other words, our findings do not mean that out-of-school time programs in Oregon don’t have similar or lasting impacts. Rather, we believe that the measures available, especially standardized test data, don’t reflect those impacts.
Some of the challenges we faced in collecting and analyzing student data are not unusual. Out-of-school time programs and researchers have grappled with difficulties connecting program participation to consistent, clear, statistically significant changes in educational outcomes for many years, often due to similar data limitations. However, some of what we’ve done and learned through the evaluation reflects nascent trends or aspects of out-of-school time that are under-researched. Finally, some of our findings are the result of relatively distinct aspects of the Initiative’s design.
Researchers are increasingly focused on figuring out how out-of-school time can be most effective—what program quality looks like and how to increase it—and otherwise attending to the conditions that support student success, including the components of high-quality programming. We believe that by sharing what we’ve learned about the learning community, and the ways that programs have benefited from engaging in the learning community and the YPQ process, we are contributing to a growing body of work that seeks to understand what programs can do to improve quality.
Our efforts to measure and understand social and emotional learning also reflect still-emerging areas of research in out-of-school time. Whereas there are only a few program quality frameworks for out-of-school time (which largely overlap), there are dozens of frameworks for social and emotional learning. Researchers continue to struggle with how best to define social and emotional learning and to measure how and why growth occurs (American Institutes for Research, 2015). While there’s a great deal more to learn about how programs integrate and support social and emotional learning, and how that work relates to other student impacts, our evaluation illustrates the value of focusing energy on this area.
We also believe our findings relating to culturally specific programs are particularly noteworthy, providing an important contribution to the field given how little has been published about how, and how well, these programs support students.
Last, we hope that by highlighting the perspectives of students, families and staff, we are demonstrating the value and validity of this data. The data gathered through student surveys, photovoice projects, staff interviews and program observations overwhelmingly affirms the value of out-of-school time programs in supporting student success in school and beyond.
About the achievement and opportunity gaps
Through the Initiative, as well as through Oregon Community Foundation’s larger work on the opportunity gap, we now recognize that the achievement gap results from myriad factors, including gaps in opportunity that begin in the earliest years and continue through a student’s educational experience.
The opportunity gap commonly results from intentional policy decisions that reinforce long-standing patterns of resource access for middle-class, English-speaking, white families, while limiting access for other communities. Our understanding of who is most likely to face the opportunity gap and other systemic barriers—i.e., students of color, students from under-resourced rural communities, and students from low-income families—has been confirmed. We also have a richer understanding of the limitations on social and economic mobility present for many of these students because of where they are growing up, due to historical forces that will require generations to overcome.
While focusing on gaps in opportunity helps funders direct resources where they will be of the greatest benefit and are most likely to advance equity, we consider the concept of the achievement gap to be too deficit-based. Further, focusing on student “achievement” (or lack thereof) as the central problem distracts from systemic drivers of student success. A focus on measuring student achievement, especially using standardized educational measures that reflect systemically inequitable practices (e.g., disproportionate discipline and standardized tests), places the burden on students rather than on the broader forces that cause educational disparities. We are not suggesting that student achievement is never a valuable measure of student progress. But we believe that using achievement measures without also attending to the systemic drivers of educational disparities will not help us achieve our ambitious goals for alleviating those disparities.
Download a brief, printer-friendly PDF on reframing out-of-school time impacts.
As we share these findings, the Initiative team is making plans to review the Initiative’s goals and theory of change to reflect this evolution in our understanding of what students face and our thinking about what the Initiative can accomplish.
Ultimately, by funding out-of-school time programs that serve students experiencing the opportunity gap, and strengthening students’ access to high-quality programming, the Initiative is directly addressing the opportunity gap. The Initiative evaluation confirms that this is a useful step or strategy for addressing educational disparities. But we also must recognize that many other interventions, including system-level changes, will be necessary if we want to eliminate those disparities entirely.