
EDITORIAL article

Front. Psychol., 08 September 2023
Sec. Developmental Psychology
This article is part of the Research Topic Implementation of Social and Emotional Learning Interventions in Applied Settings: Approaches to Definition, Measurement, and Analysis (21 articles).

Editorial: Implementation of social and emotional learning interventions in applied settings: approaches to definition, measurement, and analysis

  • 1Harvard Graduate School of Education, Harvard University, Cambridge, MA, United States
  • 2Early Childhood Innovation Network, Department of Psychiatry, Georgetown University Medical Center, Washington, DC, United States

Implementation matters for SEL intervention effectiveness

More than two decades of meta-analytic research documents the effectiveness of social and emotional learning (SEL) interventions for improving social-emotional competencies and longer-term academic outcomes, behavioral functioning, and mental health (Durlak et al., 2011, 2022; Jones et al., 2021; Cipriano et al., 2023). Implementation research suggests that outcomes are more robust when interventions are implemented with adherence to their intended model (Durlak and DuPre, 2008). In a meta-analysis of 213 studies of SEL interventions, programs implemented with fidelity produced greater improvements in children's outcomes than programs whose studies reported implementation challenges (Durlak et al., 2011).

There are a number of ways to characterize and measure implementation. Dane and Schneider (1998) defined five dimensions of program integrity (adherence, exposure, quality, participant responsiveness, and program differentiation) that remain part of most current definitions of implementation outcomes (Proctor et al., 2011). Over time, fidelity has emerged as a broader construct, with adherence and dosage serving as indicators within that dimension (Century et al., 2010; Proctor et al., 2011). Most studies of SEL program implementation assess fidelity or dosage, while fewer focus on quality or participant responsiveness (O'Donnell, 2008; Berkel et al., 2011).

Several conceptual frameworks have been developed to illustrate the multiple factors at various ecological levels that influence the implementation process (Wandersman et al., 2008; Damschroder et al., 2009; Meyers et al., 2012). Domitrovich et al. (2008) developed a three-level ecological framework for organizing factors that relate specifically to the implementation of school-based interventions: macro-level factors (e.g., policies and financing, community capacity and empowerment), school-level factors (e.g., organizational functioning, school and classroom culture, and climate), and individual-level factors [e.g., psychological functioning (burnout and self-efficacy) and perceptions of and attitudes toward the intervention]. Several studies have empirically validated the importance of multi-level factors as predictors of SEL implementation (e.g., see Malloy et al., 2015; Domitrovich et al., 2019; Musci et al., 2019; Cramer et al., 2021).

Studies that include fidelity data often report variability both at the individual level between implementers and at the organizational level across the settings (e.g., schools) where programs are delivered, suggesting the need to understand associations between implementation and outcomes more deeply by examining potentially relevant individual, school, and community factors (Durlak and DuPre, 2008). While research underscoring the importance of implementation fidelity for SEL interventions is growing, implementation is rarely the primary focus of SEL research and is typically not measured or described in sufficient detail. However, we believe that some of what is relatively equivocal in research on SEL programs (e.g., Jones et al., 2019) can be addressed by documenting and understanding implementation with greater precision and depth. In this Research Topic, authors addressed a wide variety of topics, including: (1) approaches to measuring SEL implementation and engagement, (2) applying innovative quantitative and qualitative methods, (3) investing in partnerships with practitioners, (4) multi-level factors predicting implementation, (5) equity in defining and measuring SEL and its implementation, and (6) understanding implementation in global contexts (see Table 1 for an overview of all the manuscripts included in this issue). In the sections below we describe key themes and ideas across the many papers in this Research Topic.


Table 1. Overview of manuscripts included in the SEL implementation Research Topic.

Measuring SEL implementation and engagement

Two papers described comprehensive implementation measurement strategies that monitored implementation of both the intervention and the support system. Choles et al. developed a two-level conceptual implementation framework and aligned measures to capture fidelity of implementation of a mindfulness-focused SEL program for pre-school children. The model is innovative because it focuses on program implementation supports for teachers (e.g., coaching) as well as teacher implementation of the curriculum in the classroom. Martinsone et al. describe their approach to creating a system for monitoring the implementation of a comprehensive school-based mental health program for elementary and middle school students that included universal curriculum lessons as well as home-based activities. The implementation supports included teacher training and the development of school teams whose members helped ensure program fidelity and quality. The program was delivered in six countries and the authors paid close attention to capturing cultural adaptations as part of fidelity monitoring.

A number of papers proposed new approaches to measuring SEL implementation. Wu et al. designed an approach to capture nuanced features of implementation of non-curricular, flexible approaches organized as brief activities across SEL domains (Mindfulness and Brain Games in this study) in humanitarian settings. The paper presents three dimensions of dosage: quantity (how much), duration (for how long), and temporal pattern (how often). This approach can capture (1) how often activities targeting the same SEL domain are repeated and (2) how many activities are implemented before at least one activity is attempted from each available SEL domain—providing insight into patterns of intervention implementation and exposure, in addition to quantity and duration. Devlin et al. noted that most implementation measures focus on the implementer even though children's engagement during implementation likely influences children's outcomes. The authors developed a four-step protocol designed to capture active child engagement by observing children's behavior. The protocol focused on identifying points of active child engagement, operationalizing and measuring those dimensions, and analyzing the data by linking child engagement to other meaningful variables. Bodrova et al. discuss the importance of play as a context for SEL development during early childhood and the challenges of monitoring the fidelity of this activity when it is a component of an SEL intervention. They argue that play components of SEL interventions need to be made “visible” and that nuanced measures of play need to be developed so research can isolate the important characteristics of play that predict positive social and behavioral development over time. All three papers underscore the need to consider implementation as a multi-faceted, dynamic process that requires attention to multiple aspects of implementation (e.g., implementer and recipient, multiple intervention components, implementation context, etc.).
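The three dosage dimensions Wu et al. propose can be made concrete with a small sketch. The log format, function name, and summary metrics below are hypothetical simplifications for illustration, not the study's actual instrument:

```python
from collections import Counter

def dosage_summary(log):
    """Summarize illustrative dosage dimensions from a hypothetical activity log.

    `log` is a list of (day, domain, minutes) tuples recording brief SEL
    activities; the format is invented for this sketch.
    """
    quantity = len(log)                                # how much: number of activities
    duration = sum(m for _, _, m in log)               # for how long: total minutes
    days = sorted({day for day, _, _ in log})
    span = days[-1] - days[0] + 1 if days else 0
    frequency = len(days) / span if span else 0.0      # how often: share of active days
    repeats = Counter(domain for _, domain, _ in log)  # repetition per SEL domain
    return {"quantity": quantity, "duration": duration,
            "frequency": frequency, "domain_counts": dict(repeats)}

log = [(1, "Mindfulness", 10), (1, "Brain Games", 15),
       (3, "Mindfulness", 10), (5, "Brain Games", 15)]
print(dosage_summary(log))
```

Separating quantity, duration, and temporal pattern in this way lets analyses distinguish, for example, many short activities clustered into one week from the same total minutes spread evenly across a term.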

Applying innovative quantitative and qualitative methods

Studies employed a variety of innovative qualitative and quantitative analytic approaches. Two studies used latent profile analysis to create descriptive profiles (or categories) of implementation and explore associations between teacher/classroom profiles and children's outcomes. Zhao et al. used measures of dosage, adherence, quality of delivery, and student engagement to identify three latent profiles of implementation (high, moderate, and low). Classrooms with moderate- and high-level implementation practices showed significantly higher gains in student outcomes than those with low-level implementation. Similarly, Gómez et al. identified two latent profiles (below-average and above-average implementation) using measures of teacher responsiveness (teacher evaluation of the training sessions) and amount of exposure to implementation supports (ratings of coaching and time spent with the coach). Teachers in the below-average profile were less responsive to training and received less support than teachers in the above-average profile. Using propensity scores, the authors found that more experienced teachers and teachers reporting lower levels of burnout were more likely to implement the intervention as intended.

Integrating SEL and youth participatory action research (YPAR), a form of critical participatory action research (CPAR), represents another novel and promising methodological approach (Meland and Brion-Meisels). In YPAR, youth are full participants in the research process and seen as the experts of their own lives and contexts. The authors propose a set of four core commitments as the mechanisms of YPAR that nurture SEL (e.g., democratic participation that centers youth expertise) and conceptualize SEL implementation integrity as adherence to a set of commitments rather than fidelity to a specific set of activities. This approach offers a more flexible way to think about implementation, one that centers youth empowerment and voice.

We were also pleased to receive several papers using qualitative methods that focus on understanding individual perceptions and experiences. Using interviews, focus groups, and observations, Dyson et al. explore educators' views on SEL in a rural, high-needs elementary school setting. While educators "bought in" to SEL, they reported lack of time, lack of preparation and development, home-school disconnection, and pushback from students as significant constraints. Another study used a mixed-methods approach to study the Early Childhood Mental Health Consultation (ECMHC) pilot in Virginia. Partee et al. interviewed participants who chose not to participate in ECMHC or opted out after consultation began and conducted focus groups with participants who had sustained engagement. These qualitative papers provide perspectives from a wider set of voices often not included in traditional impact and implementation research, including rural settings and those who opt not to participate in interventions.

Investing in partnerships with practitioners

Building on the themes of incorporating previously undervalued voices in new ways, a handful of studies centered partnerships with educators. Grant et al. shared findings from a multi-year research-practice partnership (RPP) designed to support SEL implementation in a district. The authors offer key lessons learned related to developing feasible and meaningful implementation measures, identifying structures that can support the collection and use of implementation data to improve practice, presenting data for various audiences, and creating systems for sustainable data use. Spacciapoli et al. conceptualize fidelity as part of an ongoing professional development feedback ecosystem. Teachers record short videos across the school year, review and reflect on their video, and receive targeted feedback from a coach. The method approaches fidelity of implementation as a developmental journey, expecting teachers to improve over time, and uses a nuanced set of indicators tracked across the school year. These examples demonstrate that mutually beneficial relationships between researchers, practitioners, and other stakeholders can create conditions for iterative cycles of design and testing and the development of sustainable systems of SEL implementation, data collection, and use.

Multi-level factors predicting implementation

A number of studies examined predictors of implementation. Ulla and Poom-Valickis conducted a systematic review and identified four categories of contextual factors that can influence implementation: program support, school, teacher, and student level factors. Their analysis focused on the relative importance of these factors and found that the most frequently statistically significant factors included modeling activities during coaching and the teacher-coach working relationship. Teacher burnout was uniquely related to program dosage. In community-based childcare centers, Hunter et al. examined factors that predict the implementation of a comprehensive preschool program that includes curricular components and teaching practices designed to promote the social-emotional development of young children. They found that baseline teaching practices and responsiveness to the intervention (and not teacher education or experience) predicted the quality of intervention activities and teaching strategy delivery. Workplace factors (e.g., classroom resources and job satisfaction) predicted multiple features of implementation. Braun et al. also examined workplace factors (i.e., occupational health) that predict the implementation of a universal social-emotional learning program delivered in elementary schools by early career teachers. The authors went a step beyond typical research examining the main effects of implementation predictors to test interactions among factors. They found that perceptions of program feasibility moderated the relationship between job stress and implementation quality in unexpected ways. In a unique study of implementation predictors, Thierry et al. conducted secondary analyses of a national dataset that included information on school district and macro-level factors to explore factors associated with teacher- and counselor-facilitated delivery of a universal social-emotional learning program conducted in elementary and middle schools. The studies included in this section highlight the multi-level factors that shape the conditions for SEL implementation in schools.
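The moderation effect Braun et al. test corresponds, statistically, to an interaction term in a regression model. A minimal sketch with simulated data (all variable names and effect sizes below are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200

# Hypothetical standardized predictors: job stress and perceived feasibility.
stress = rng.normal(size=n)
feasibility = rng.normal(size=n)

# Simulate implementation quality with a built-in stress x feasibility
# interaction: stress hurts quality most when perceived feasibility is low.
quality = (-0.3 * stress + 0.2 * feasibility + 0.4 * stress * feasibility
           + rng.normal(scale=0.5, size=n))

# Ordinary least squares with intercept, main effects, and interaction term.
X = np.column_stack([np.ones(n), stress, feasibility, stress * feasibility])
beta, *_ = np.linalg.lstsq(X, quality, rcond=None)
coefs = dict(zip(["intercept", "stress", "feasibility", "interaction"], beta))
print({k: round(v, 2) for k, v in coefs.items()})
```

A non-negligible interaction coefficient means the slope of quality on stress depends on the level of feasibility, which is the statistical signature of moderation.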

Equity in defining and measuring SEL and its implementation

Three papers included equity as a central component. Lin et al. examined how pre-service educators define SEL. Educators conceptualized SEL as broader than competency-based models. They instead considered SEL as an opportunity to respond to student and community needs, center humanity, and advance social justice. Participants advocated for a co-created, humanizing SEL approach that honors identity, promotes justice, and ultimately can dismantle inequitable systems. YPAR, the approach described by Meland and Brion-Meisels, elevates individual voices and empowers youth to engage in collective action that aims to disrupt systems of inequity and promote positive change in communities. Underlying the approach is a trusting, equitable, reciprocal relationship that remains a central part of the entire YPAR process. Finally, Spacciapoli et al. noted that measures of implementation often leave teachers “in the dark” about observation goals and items as well as implementation strengths and areas for improvement. The authors' transparent approach includes teachers as collaborators in implementation measurement and tracking by engaging them in observing and reflecting on their practice in order to create equitable partnerships.

Understanding implementation in global contexts

Papers in this Research Topic describe SEL efforts across the globe, including programming delivered in humanitarian settings in Sierra Leone (Wu et al.), community-based programs conducted in Rio de Janeiro, Brazil (McCoy and Hanno), and Colombia (Harker Roa et al.), and a comprehensive school-based mental health program implemented in several European countries (Italy, Latvia, Romania, Croatia, Greece, and Portugal; Martinsone et al.). The two papers describing SEL efforts in South America both discuss the challenges associated with delivering programs in extreme conditions, including community violence, forced displacement, and extreme adversity, and the importance of cultural adaptation and flexibility. Harker Roa et al. identified cultural adaptation as an implementation enabler that was successful when it was conducted intentionally following a structured protocol. They also found that with sufficient training and support, their parent-focused program could be delivered by paraprofessionals, an example of "task-shifting" that was necessary because of the shortage of mental health professionals in Colombia. McCoy and Hanno also identify culture as a key factor that influenced the delivery of their SEL programming in Brazil. Their perspectives on the importance of this and other macro-level factors, including timing and government support, came from their work delivering an SEL program in elementary schools and from their review of similar research conducted in low-income, conflict-affected settings.

Considerations for the future

Put quite simply, the collection of articles in this Research Topic tells us unequivocally that the quality and quantity of implementation of social and emotional learning programs, strategies, and practices are the cornerstone of their efficacy. But that is not all these innovative and penetrating articles tell us. Indeed, they go beyond relatively "simple" questions of whether and how implementation factors make a difference and stretch our body of knowledge, pushing us to (1) consider new ways of measuring, operationalizing, and analyzing implementation data, (2) incorporate the perspectives of those often not represented in our implementation data, (3) expand our view well beyond our own borders and bring multiple contexts and settings into the broader body of work, and (4) sustain SEL by developing innovative support systems that address both individual and contextual factors that influence the process of implementation. In addition, these papers leave us with some directions to consider for future work. Among many possible directions for future research, we highlight three key questions that build from these papers and, in our view, are central to moving the field forward.

• What are the common and unique multi-level predictors of implementation quality and quantity that ultimately represent universal and program-specific factors?

• Can we embed implementation data collection, reflection, and adapted practice into program design and delivery in ways that create meaningful improvements, rather than considering implementation as an added and separate effort?

• What do cross-cutting and persistent patterns of implementation (i.e., lower than expected dosage, fidelity, and overall engagement) suggest about potential changes required to program design, delivery expectations, and pre-implementation training?

Author contributions

SB: Writing—original draft, Writing—review and editing. CD: Writing—review and editing. SJ: Writing—review and editing.

Conflict of interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Publisher's note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

References

Berkel, C., Mauricio, A. M., Schoenfelder, E., and Sandler, I. N. (2011). Putting the pieces together: an integrated model of program implementation. Prevent. Sci. 12, 23–33. doi: 10.1007/s11121-010-0186-1


Century, J., Rudnick, M., and Freeman, C. (2010). A framework for measuring fidelity of implementation: a foundation for shared language and accumulation of knowledge. Am. J. Eval. 31, 199–218. doi: 10.1177/1098214010366173


Cipriano, C., Strambler, M. J., Naples, L. H., Ha, C., Kirk, M., Wood, M., et al. (2023). The state of evidence for social and emotional learning: a contemporary meta-analysis of universal school-based SEL interventions. Child Dev. 1–24. doi: 10.1111/cdev.13968. [Epub ahead of print].


Cramer, T., Ganimian, A., Morris, P., and Cappella, E. (2021). The role of teachers' commitment to implement in delivering evidence-based social-emotional learning programs. J. Sch. Psychol. 88, 85–100. doi: 10.1016/j.jsp.2021.08.003


Damschroder, L. J., Aron, D. C., Keith, R. E., Kirsh, S. R., Alexander, J. A., and Lowery, J. C. (2009). Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement. Sci. 4, 50. doi: 10.1186/1748-5908-4-50


Dane, A. V., and Schneider, B. H. (1998). Program integrity in primary and early secondary prevention: are implementation effects out of control? Clin. Psychol. Rev. 18, 23–45. doi: 10.1016/S0272-7358(97)00043-3


Domitrovich, C. E., Bradshaw, C. P., Poduska, J. M., Hoagwood, K., Buckley, J. A., Olin, S., et al. (2008). Maximizing the implementation quality of evidence-based preventive interventions in schools: a conceptual framework. Adv. Sch. Mental Health Promot. 1, 6–28. doi: 10.1080/1754730X.2008.9715730


Domitrovich, C. E., Li, Y., Mathis, E. T., and Greenberg, M. T. (2019). Individual and organizational factors associated with teacher self-reported implementation of the PATHS curriculum. J. Sch. Psychol. 76, 168–185. doi: 10.1016/j.jsp.2019.07.015


Durlak, J. A., and DuPre, E. P. (2008). Implementation matters: a review of research on the influence of implementation on program outcomes and the factors affecting implementation. Am. J. Commun. Psychol. 41, 327. doi: 10.1007/s10464-008-9165-0


Durlak, J. A., Mahoney, J. L., and Boyle, A. E. (2022). What we know, and what we need to find out about universal, school-based social and emotional learning programs for children and adolescents: a review of meta-analyses and directions for future research. Psychol. Bull. 148, 765. doi: 10.1037/bul0000383


Durlak, J. A., Weissberg, R. P., Dymnicki, A. B., Taylor, R. D., and Schellinger, K. B. (2011). The impact of enhancing students' social and emotional learning: a meta-analysis of school-based universal interventions. Child Dev. 82, 405–432. doi: 10.1111/j.1467-8624.2010.01564.x


Jones, S. M., Brush, K. E., Ramirez, T., Mao, Z. X., Marenus, M., Wettje, S., et al. (2021). Navigating SEL From the Inside Out: Looking Inside and Across 33 Leading SEL Programs: A Practical Resource for Schools and OST Providers (Revised and Expanded Second Ed. Elementary School Focus). New York, NY: The Wallace Foundation. Available online at: https://www.wallacefoundation.org/knowledge-center/Documents/Navigating-Social-and-Emotional-Learning-from-the-Inside-Out.pdf (accessed June 15, 2023).


Jones, S. M., McGarrah, M. W., and Kahn, J. (2019). Social and emotional learning: a principled science of human development in context. Educ. Psychol. 54, 129–143. doi: 10.1080/00461520.2019.1625776


Malloy, M., Acock, A., DuBois, D. L., Vuchinich, S., Silverthorn, N., Ji, P., et al. (2015). Teachers' perceptions of school organizational climate as predictors of dosage and quality of implementation of a social-emotional and character development program. Prevent. Sci. 16, 1086–1095. doi: 10.1007/s11121-014-0534-7


Meyers, D. C., Durlak, J. A., and Wandersman, A. (2012). The quality implementation framework: a synthesis of critical steps in the implementation process. Am. J. Commun. Psychol. 50, 462–480. doi: 10.1007/s10464-012-9522-x


Musci, R. J., Pas, E. T., Bettencourt, A. F., Masyn, K. E., Ialongo, N. S., and Bradshaw, C. P. (2019). How do collective student behavior and other classroom contextual factors relate to teachers' implementation of an evidence-based intervention? A multilevel structural equation model. Dev. Psychopathol. 31, 1827–1835. doi: 10.1017/S095457941900097X


O'Donnell, C. L. (2008). Defining, conceptualizing, and measuring fidelity of implementation and its relationship to outcomes in K−12 curriculum intervention research. Rev. Educ. Res. 78, 33–84. doi: 10.3102/0034654307313793


Proctor, E., Silmere, H., Raghavan, R., Hovmand, P., Aarons, G., Bunger, A., et al. (2011). Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Admin. Policy Mental Health Mental Health Serv. Res. 38, 65–76. doi: 10.1007/s10488-010-0319-7


Wandersman, A., Duffy, J., Flaspohler, P., Noonan, R., Lubell, K., Stillman, L., et al. (2008). Bridging the gap between prevention research and practice: the interactive systems framework for dissemination and implementation. Am. J. Commun. Psychol. 41, 171–181. doi: 10.1007/s10464-008-9174-z


Keywords: implementation, social and emotional learning (SEL), intervention, social and emotional learning (SEL) skills, measurement, schools

Citation: Barnes SP, Domitrovich CE and Jones SM (2023) Editorial: Implementation of social and emotional learning interventions in applied settings: approaches to definition, measurement, and analysis. Front. Psychol. 14:1281083. doi: 10.3389/fpsyg.2023.1281083

Received: 21 August 2023; Accepted: 29 August 2023;
Published: 08 September 2023.

Edited and reviewed by: Pamela Bryden, Wilfrid Laurier University, Canada

Copyright © 2023 Barnes, Domitrovich and Jones. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Stephanie M. Jones, jonesst@gse.harvard.edu
