ORIGINAL RESEARCH article

Front. Psychol., 21 August 2023
Sec. Educational Psychology
This article is part of the Research Topic: The Case of Social Emotional Learning: Evidence-Based Practices

Validation of a community-based application of the Portuguese version of the survey on Social and Emotional Skills – Child/Youth Form

  • 1Department of Psychology, Center for Psychological Research and Social Intervention – CIS-IUL, ISCTE – University Institute of Lisbon, Lisbon, Portugal
  • 2Institute of Interdisciplinary Research, Center of Interdisciplinary Studies (CEIS20), University of Coimbra, Coimbra, Portugal

Introduction: Children and adolescents’ social and emotional skills have been gaining attention in diverse settings. With over 100 conceptual frameworks available, there is now a common move toward framing these skills as social and emotional learning (SEL), assuming that they are not only amenable to development, but also malleable to change as a product of intervention. As such, there is a strong need for a comprehensive measure to effectively evaluate such skills, validated for different age groups of children and young people, and applicable to both educational contexts and community settings.

Methods: This paper presents the validation of the Portuguese adaptation of the Child/Youth form of the Survey on Social and Emotional Skills (SSES), in the scope of the Gulbenkian Academies for Knowledge initiative with a sample of 7,831 participants between 8 and 17 years old (M = 11.79, SD = 2.94).

Results: Results show that the measure has good internal consistency and item sensitivity, and is also sensitive to change over time. Preliminary factor analysis shows promise, although further research is necessary.

Discussion: Discussion reflects on the value of the Child/Youth form of the SSES as a comprehensive measure to be used by community and educational professionals to monitor skill development and improve their work on SEL.

Introduction

Social and emotional skills are a multidimensional construct that encompasses a set of intrapersonal competencies, important for the overall functioning of individuals, and interpersonal ones, essential for successfully interacting with others (Domitrovich et al., 2017).

Social–emotional learning (commonly referred to as SEL) is the process by which social and emotional skills are developed. According to Weissberg et al. (2015), it is through this process that knowledge, attitudes, and abilities are acquired, which are fundamental to managing emotions, achieving a set of goals, feeling and showing empathy for others, establishing and maintaining interpersonal relationships, and making responsible decisions.

Children and adolescents’ social and emotional skills have been gaining attention in diverse settings, including educational contexts and community settings. The importance of these skills is now also shaping policy. For example, these skills have recently been included in the reference framework that guides education policies in Portugal – the Profile of Students Leaving Compulsory School (Martins et al., 2017).

The need for reliable and valid measurements of social and emotional skills is of crucial importance for evaluating SEL interventions. As Duckworth and Yeager (2015) state, measurement matters for many reasons: from a practical standpoint, changing certain competencies is easier when we can measure them, so data is important to inform action. Measurement also helps to inform progress monitoring, and to effectively evaluate SEL programs (Brush et al., 2022). For instance, the Data Wise model (Boudett et al., 2013) is an eight-step model that guides educational teams to improve teaching and learning by performing evidence-based analyses, motivating them to apply and systematize practices with the aim of articulating intervention and evaluation prior to implementation, and restructuring the intervention after its conclusion, based on collected data (Boudett et al., 2013).

However, there is great variability in available measures for social and emotional skill assessment, particularly in terms of the behaviors they capture, the uniqueness of their constructs (as opposed to some degree of theoretical overlap), how comprehensive they are (measuring a single skill versus several dimensions and skills), the respondents they engage (children, youth, parents, teachers), and the response format (questionnaires, observation measures, tasks; Humphrey et al., 2011). More importantly, SEL assessments tend to “vary greatly depending on the theoretical frameworks that underlie them” (Murano et al., 2021, p. 1).

This variability leads to diverse approaches, and often to conceptual confusion, regarding how these skills are defined, how they translate into observable behavior and, consequently, how they are measured, leading most available instruments to be highly specific to the evaluation of a given intervention or the measurement of a given skill (Martinez-Yarza et al., 2023). In a systematic review, Humphrey et al. (2011) identified 12 instruments for measuring SEL. The authors concluded that many of these measures were not being extensively used and disseminated, and that they were unevenly distributed across the targeted skills; for example, emotional skills were evaluated less often than social skills. The authors also argued that most measures had only been validated for the United States and the United Kingdom, and only for non-diverse groups of children (Humphrey et al., 2011).

More recently, with the increased popularity of social and emotional learning approaches, a systematic review by Martinez-Yarza et al. (2023) identified 25 measures developed over a 20-year period, covering elementary through secondary education, usually targeting some SEL dimensions but not all of them. This review also shows that the most frequently used assessment method was indirect assessment relying mainly on Likert scales, suggesting the need to validate brief and user-friendly measures, as well as to use a combination of multi-method and multi-informant assessment to effectively assess these skills (Martinez-Yarza et al., 2023). SEL measures also need to capture the dynamic interaction between individuals and their environment and context (Brush et al., 2022).

The study on social and emotional skills

Recognizing the central role of SEL at a young age for healthy and successful adjustment throughout life, the Organization for Economic Cooperation and Development (OECD) developed the Study on Social and Emotional Skills (SSES). The development of the study was underpinned by an effort to consolidate and disambiguate knowledge about how SEL skills develop in children and youth, and about which aspects of children’s daily settings – family, school, community – promote or hinder this development.

The OECD’s Study on Social and Emotional Skills (and, subsequently, its survey) was framed under the Big Five model, following one of the most common frameworks for personality and skills. However, also following recent trends in the literature, it understands these skills as malleable, learnable and context dependent, as opposed to fixed traits of personality (Kankaraš and Suarez-Alvarez, 2019). The Big Five structure aims to provide a “general outline of how these skills are organized” (Chernyshenko et al., 2018, p. 9), since this five factor structure has been commonly found in personality and skills research in different cultures and settings (e.g., McCrae and Costa, 1997), including for children and young people (e.g., Tackett et al., 2012), and correlates highly with several outcomes throughout life (such as wellbeing, academic and professional success, or physical health; Chernyshenko et al., 2018).

Following a thorough review, the OECD opted for this Big Five structure to guide its approach to social and emotional skills, organizing its study into five dimensions: Collaboration, Task Performance, Emotional Regulation, Engagement with Others, and Open-mindedness. These relate, respectively, to the classic Big Five domains of Agreeableness, Conscientiousness, Neuroticism, Extraversion, and Openness to Experience. Each dimension then encompasses several individual skills, which are the focus of the SSES.

Collaboration is understood as the ability to have sympathy toward others and express altruism, leading to better quality relationships and more pro-social behaviors; it includes the individual skills of Empathy, Trust, and Cooperation. Task performance relates to being self-disciplined and persistent, with a tendency to stay on task and to be a high achiever; it includes the individual skills of Responsibility, Self-control, and Persistence. Emotional regulation refers to the ability to effectively manage negative emotional experiences and stressors, and includes the skills of Stress resistance, Optimism, and Emotional control. Engagement with others characterizes those who are extraverted, energetic, positive, and assertive, with an ease in establishing social connections; it includes the skills of Sociability, Assertiveness, and Energy. Lastly, Open-mindedness reflects the willingness to accommodate different perspectives and new experiences, and includes the individual skills of Curiosity, Tolerance, and Creativity (OECD, 2021).

As a product of the study, a comprehensive measure was developed, the Survey on Social and Emotional Skills, and was administered to over 60,000 participants aged 10 and 15 in 10 cities around the world, collecting data on 15 different social and emotional skills, as well as on sociodemographic, family, school, and community contextual characteristics, with data on students’ skills being reported by students, families, and teachers. The SSES was the first large-scale international survey of SEL (OECD, 2021). Portugal was represented in this study by the Municipality of Sintra, contributing over 3,000 participants, which constituted the sample for the initial Portuguese adaptation of this instrument.

This study, and the resulting survey, stand as a valuable effort to develop a comprehensive measure for the assessment of a broad array of social and emotional skills. The concern for the predictive value of the selected skills, the suitability of a Big Five approach to skills across different cultures and ages, and the comprehensive nature of the questionnaire support its suitability for measuring these social and emotional skills, allowing researchers and practitioners to further delve into the evidence-based promotion and evaluation of SEL.

Gulbenkian Academies for Knowledge

In 2018, the Portuguese Calouste Gulbenkian Foundation set out to implement a central mechanism for the development and support of innovative solutions to complex societal problems. To do so, the Foundation offered to co-fund intervention approaches to SEL, named Gulbenkian Academies for Knowledge (henceforth referred to as Academies), which could span a broad array of domains – education, science learning, health, civic participation, among others – under the common umbrella of developing the social and emotional competencies of children and youth aged 0 to 25 across the country. Between 2018 and 2022, the Foundation opened three rounds of applications (2018, 2019, 2020) to select 100 community- or school-based projects. Each project could be implemented across 1, 2, or 3 years. Because some Academies chose to test their intervention only in their second year of funding, there were in total four cohorts of Academies, across four school years.

To achieve the goal of promoting these skills, 35 Academies chose to implement intervention approaches previously validated with experimental or quasi-experimental evidence and proven results (such as the Incredible Years Program), while the remaining 65 chose to develop and implement pilot approaches, i.e., innovative interventions designed by each Academy, with the potential to be rigorously evaluated and validated as effective interventions. By joining the GAK initiative, each Academy also committed to implementation monitoring (Durlak and DuPre, 2008) and to an experimental or quasi-experimental impact evaluation of its intervention, with the aim of contributing to the production of knowledge and the dissemination of evidence-based interventions, without compromising the quality criteria necessary for these processes. All Academies were also recommended to involve at least 100 participants in their impact evaluation, in order to ensure some statistical power. Although this was not mandatory, it was strongly recommended, and most of the projects complied with this recommendation.

In addition to co-funding the intervention, the Foundation offered the 100 selected programs the technical support of a Monitoring and Evaluation (M&E) Team, which assisted Academies in all stages of their evaluation processes, including the development of their Theory of Change, closely monitoring various dimensions of program implementation, and designing experimental, quasi-experimental or descriptive impact evaluation studies, with standardized measures administered to intervention and control/comparison groups at pre-test and post-test. The M&E Team also provided continuous support and training opportunities to all Academies throughout the initiative. The training model, based on the Data Wise model (Boudett et al., 2005), focused on aspects related to monitoring (how to design a Theory of Change, how to observe program implementation, how to use implementation monitoring data to improve interventions), impact evaluation (how to conceptually align intervention and evaluation, how to select evaluation measures, how to constitute intervention and control groups, how to analyze and discuss results), and the ethical aspects inherent to research in the field. The M&E Team did so by providing training sessions and frequent individual consultancy, and by visiting the Academies.

Of central importance to this study is that the Foundation required the use of the SSES as a common metric of impact measurement across Academies. This means Academies were required to use the SSES for pre- and post-test assessment of all participants in their evaluation. Because theories of change varied greatly across Academies, and the Foundation wanted to fund intervention approaches with a clear goal, Academies could choose a minimum of two SSES competencies to monitor across evaluation stages. Moreover, no items from the Energy subscale could be used, because this skill was not aligned with the theoretical scope of the Foundation’s work. Academies could complement their evaluation work with other standardized measures of assessment.

The present study

This study aimed to examine the psychometric properties of the Portuguese version of the Child/Youth Form of the Survey on Social and Emotional Skills (SSES; OECD, 2021) used within the Gulbenkian Academies for Knowledge initiative with a large Portuguese community sample. Specifically, we aimed to: (1) test the internal consistency of the SSES – Child/Youth Form; (2) test the scale’s sensitivity to changes in participants’ social and emotional skills between pre-test and post-test; and (3) explore the factor structure of the SSES – Child/Youth Form.

Method

Sample

The study sample included participants from 43 Academies, 5 implementing validated approaches and 38 implementing pilot approaches. The requirement to use the SSES as a common impact measure was introduced starting with the second cohort of Academies, because the SSES was not available before then. However, due to the low quality and quantity of data from the second cohort (2019–2020), which was severely impacted by the outbreak of the COVID-19 pandemic midyear, data from these 43 Academies that implemented the SSES Child/Youth Form come from the third and fourth cohorts only (2020–2021 and 2021–2022, respectively). Academies that chose not to administer the SSES in any of its forms, or that only administered its Parent or Teacher Forms, were also excluded from the present study. Finally, only participants between the ages of 8 and 17 were included in this study sample, in order to test the validity of this instrument for participants up to 2 years older and 2 years younger than the participants in the two cohorts of the original OECD study (10- and 15-year-old cohorts).

This means the inclusion criteria for this study comprised all participants from the two final cohort years of the Gulbenkian Academies for Knowledge initiative between the ages of 8 and 17, with available information on age and sex, as well as responses on the SSES – Child/Youth Form either at pre-test or at post-test. The sample comprises 7,831 participants, 52% of whom were female, with ages ranging from 8 to 17 years old (M = 11.79, SD = 2.94). Mean school grade was the 6th grade (M = 6.15, SD = 2.85), and the majority of participants (74%) were Portuguese. As for family characteristics, both parents were predominantly Portuguese (85% of mothers and 86% of fathers), and their highest educational level was, on average, high school, although mothers scored higher (mother’s educational level M = 3.95, SD = 1.14; father’s educational level M = 3.69, SD = 1.21¹). Most families lived in an urban setting (84%), with a fifth (19%) benefitting from some form of social assistance from social security services (Table 1).


Table 1. Participants’ sociodemographic characteristics.

Measure

The SSES – Child/Youth form (OECD, 2021) is a self-report instrument composed of 120 items, answered on a scale from one (Totally disagree) to five (Totally agree), which allows the assessment of a set of 15 social and emotional skills by child or youth participants aged between 8 and 17 years old. It includes the following 15 subscales, with eight items each: Optimism (OPT; “I look at the bright side of life”), Responsibility (RES; “I am a responsible person”), Curiosity (CUR; “I like learning new things”), Self-control (SEL; “I stop to think before acting”), Emotional control (EMO; “I stay calm even in tense situations”), Cooperation (COO; “I get along well with others”), Sociability (SOC; “I make friends easily”), Assertiveness (ASS; “I enjoy leading others”), Creativity (CRE; “I have a good imagination”), Resilience/Stress resistance (STR; “I am relaxed and handle stress well”), Persistence/Perseverance (PER; “I make sure that I finish tasks”), Empathy (EMP; “I know how to comfort others”), Tolerance (TOL; “I like hearing about other cultures and religions”), Trust (TRU; “I believe most people are kind”), and Energy (ENE; “I am full of energy”). The survey could be administered in paper or online format. Data from the global sample of the SSES main study by the OECD (2021) indicate Cronbach’s alpha internal consistency levels between 0.71 (Empathy) and 0.85 (Assertiveness).
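For readers scoring the instrument outside of SPSS, the structure described above (15 subscales of eight items each, answered on a 1–5 scale) maps onto a simple scoring routine. The sketch below is illustrative only: it assumes a hypothetical wide-format data file whose item columns are named with the subscale abbreviation plus an item number (e.g., OPT_1 to OPT_8), and it does not handle any reverse-keyed items, which would need to be recoded before scoring.

```python
import pandas as pd

# Hypothetical column naming convention: "<SUBSCALE>_<item number>", e.g. "OPT_1" ... "OPT_8".
SUBSCALES = ["OPT", "RES", "CUR", "SEL", "EMO", "COO", "SOC", "ASS",
             "CRE", "STR", "PER", "EMP", "TOL", "TRU", "ENE"]
N_ITEMS = 8                  # each SSES subscale has eight items
SCALE_MIN, SCALE_MAX = 1, 5  # 1 = Totally disagree ... 5 = Totally agree

def score_sses(df: pd.DataFrame) -> pd.DataFrame:
    """Return one mean score per subscale plus an overall mean score per respondent."""
    scores = pd.DataFrame(index=df.index)
    for sub in SUBSCALES:
        cols = [f"{sub}_{i}" for i in range(1, N_ITEMS + 1)]
        items = df[cols].where(df[cols].ge(SCALE_MIN) & df[cols].le(SCALE_MAX))  # out-of-range -> missing
        scores[sub] = items.mean(axis=1)  # subscale score = mean of its valid items
    scores["OVERALL"] = scores[SUBSCALES].mean(axis=1)
    return scores

# Usage (hypothetical file name):
# raw = pd.read_csv("sses_child_youth.csv")
# print(score_sses(raw).describe().round(2))
```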

Procedures

Data collection

Data was collected directly by each Academy’s team with their participants, using the procedures most appropriate to the specific needs of its setting and sample. However, Academies adopted common data collection, data management, and ethical procedures, and were closely monitored by the Monitoring and Evaluation team. All Academies were therefore required, prior to assessment, to collect informed consent from each participant’s legal guardian, prepare data collection materials (paper or online versions of each measure), and prepare adequate locations (e.g., classrooms, community facilities).

Since each Academy selected the SSES subscales that best aligned with its Theory of Change, i.e., that evaluated the social and emotional skills targeted by its intervention, there is great variability in sample size for each subscale. Additionally, regarding pre-test and post-test scores, there is a decrease in sample size across subscales due to missing data: respondents may have participated in only one of the data collection moments, with attrition being common at post-test.

Data collection procedures could be managed and implemented by any adequately trained member of the Academy’s team (teachers, social and youth workers, psychologists, researchers, among others), with supervision. In some instances (particularly with adult participants and/or with the comparison or control groups), the materials were provided and the participant responded autonomously to the measures. Data was then submitted by the Academies to the M&E Team for cleaning and analysis.

Regarding ethical procedures, aside from the aforementioned written informed consent collected from legal guardians, all Academies were instructed to collect oral assent prior to assessment and to brief underage participants on study goals and procedures. Moreover, all data collection and analysis procedures ensured confidentiality, with each participant being assigned an ID by their Academy’s team, meaning all data was fully anonymous to members external to the Academy, including the monitoring and evaluation team. The M&E Team also provided regular ethics and data protection awareness training sessions to all Academies, as well as numerous mentoring sessions. All Academies whose data is included in this paper granted their approval, via signed informed consent, for it to be processed and published for this purpose by the M&E Team.

Data analysis

To test the scale’s internal consistency, we calculated Cronbach’s alpha for each subscale and for the overall score. We also examined central tendency (i.e., mean) and dispersion (i.e., standard deviation) measures, and verified the normality of the variables by analyzing skewness and kurtosis for each item. To test sensitivity to change over time, we conducted paired-samples t-tests to analyze differences between pre-test and post-test scores at the subscale level and for the overall score. Effect sizes and correlations between pre-test and post-test measures were also calculated: Cohen’s d was computed as a standardized mean difference to estimate the effect size for all subscales, whereas correlations between pre-test and post-test scores were used to verify that scores from both data collection points were positively related. Finally, a preliminary exploratory factor analysis (EFA) was conducted to explore the factor structure of the SSES – Child/Youth Form. Following Smith-Donald et al. (2007), we used principal component extraction for the 112-item version of the Child/Youth Form of the SSES, i.e., the original 120-item version excluding the 8 items from the Energy subscale. The suitability of the data was first confirmed via the Kaiser-Meyer-Olkin test (Hutcheson and Sofroniou, 1999) and Bartlett’s test of sphericity (Dziuban and Shirkey, 1974). Resulting components were rotated obliquely using Promax to allow correlation between factors. Cronbach’s alpha was then calculated for each emerging construct as an index of internal consistency. We used IBM SPSS, Version 28.0 for the analyses.
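As a rough, non-authoritative illustration of these analyses outside of SPSS, the sketch below reproduces the main computations (Cronbach’s alpha, item descriptives including skewness and kurtosis, and paired-samples t-tests with pre/post correlations and an effect size) in Python with pandas and SciPy. The variable names (items, pre, post) are hypothetical placeholders, and the paired effect size shown (mean difference divided by the standard deviation of the differences) is one common variant, not necessarily the exact formula used in the study.

```python
import pandas as pd
from scipy import stats

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a DataFrame whose columns are the items of one (sub)scale."""
    items = items.dropna()
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

def item_descriptives(items: pd.DataFrame) -> pd.DataFrame:
    """Mean, SD, skewness and kurtosis per item, used here to inspect item sensitivity."""
    return pd.DataFrame({
        "mean": items.mean(),
        "sd": items.std(ddof=1),
        "skewness": items.skew(),
        "kurtosis": items.kurt(),  # pandas reports excess kurtosis
    })

def paired_change(pre: pd.Series, post: pd.Series) -> dict:
    """Paired-samples t-test, pre/post correlation, and a paired effect size (d_z)."""
    paired = pd.concat({"pre": pre, "post": post}, axis=1).dropna()
    t, p = stats.ttest_rel(paired["pre"], paired["post"])
    r, _ = stats.pearsonr(paired["pre"], paired["post"])
    diff = paired["post"] - paired["pre"]
    d_z = diff.mean() / diff.std(ddof=1)  # one common variant of Cohen's d for paired data
    return {"n": len(paired), "t": t, "p": p, "r_pre_post": r, "d_z": d_z}
```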

Results

Internal consistency of the SSES – Portuguese Child/Youth form

Table 2 shows Cronbach’s alpha for each subscale, and correlations between subscales and the overall score for the SSES – Portuguese Child/Youth Form at pre-test. Internal consistency levels were overall good, ranging from 0.697 (Empathy) to 0.903 (Persistence/Perseverance), while the overall scale showed an excellent level of internal consistency (α = 0.951). Moderate to high correlations with the overall score were found for 12 subscales, ranging between 0.627 (Tolerance) and 0.881 (Persistence/Perseverance). Only two subscales (Assertiveness, r = 0.403; and Resilience/Stress Resistance, r = 0.480) showed low yet significant correlations with the overall scale.


Table 2. Cronbach’s alpha, correlations and overall score for SSES – Child/Youth form.

Sensitivity of the SSES

Descriptive statistics and overall score

Supplementary Table 1 presents descriptive information for the items, subscales, and overall score for the SSES – Child/Youth Form at pre-test.

Overall mean results at pre-test ranged between M = 1.96 (SD = 1.09, Resilience Item 3) and M = 4.61 (SD = 0.65, Sociability Item 3) at the item level (overall score M = 3.49, SD = 0.67), and at the subscale level between M = 2.67 (SD = 0.89, Assertiveness) and M = 4.15 (SD = 0.53, Cooperation). All items were scored between one and five, with 96 items (85.7% of the total) scoring, on average, above the scale midpoint. Skewness and kurtosis values for most items indicated negatively skewed and mostly peaked distributions, reflecting a concentration of scores toward the higher end of the scale for most subscales.

Sensitivity of the SSES to change over time

Table 3 illustrates differences in SSES – Portuguese Child/Youth Form scores between pre-test and post-test. As shown previously in Supplementary Table 1, the overall score at pre-test was M = 3.49 (SD = 0.67), with subscale mean scores ranging from M = 2.67 (SD = 0.89, Assertiveness) to M = 4.15 (SD = 0.53, Cooperation). At post-test, no subscale showed a statistically significantly higher mean score than at pre-test, whereas seven subscales showed statistically significant differences in the opposite direction (i.e., with participants scoring higher at pre-test): Curiosity, Responsibility, Optimism, Self-control, Cooperation, Sociability, and Trust. The same was true for the overall scale score, with a statistically significant decrease between pre-test and post-test. This indicates that participants rated their social and emotional skills higher (toward the high end of the 5-point scale) at pre-test, before receiving any intervention. Effect sizes ranged between −0.038 (Tolerance) and 0.290 (Responsibility and Resilience/Stress Resistance) at the subscale level. Correlations between scores at pre-test and post-test were moderate and significant for most subscales, as well as for the overall scale, with correlations ranging from 0.606 (Emotional Control) to 0.727 (Sociability). Exceptions were found for the subscales Responsibility (r = 0.440), Empathy (r = 0.565), Creativity (r = 0.590), and Self-control (r = 0.599), although all were statistically significant.


Table 3. Differences between pre-test and post-test scores for SSES – Child/Youth form’s subscales.

Validity of the SSES

Factor structure of the SSES – Child/Youth form

The initial Kaiser-Meyer-Olkin test (KMO = 0.855) and Bartlett’s test of sphericity (χ2(6216) = 21,893,901, p < 0.001) confirmed the adequacy of the data for factor analysis (Pasquali, 1999; Tabachnick and Fidell, 2007). The initial exploratory factor analysis on the 112 items of the SSES – Child/Youth form indicated 25 components with eigenvalues >1 (Kaiser rule), accounting for 69.43% of the variance. However, not only was this structure difficult to interpret given the underlying theoretical framework, but most of the components’ variance weights were also too low (e.g., <2%). Since we were using data pertaining to only 14 of the 15 original subscales, we chose not to force the extraction of a fixed number of factors drawn from the original instrument. We then forced the extraction of 10 components, as these cumulatively accounted for over 50% of the total variance explained (e.g., Marôco, 2018). In the resulting factorial structure, one item (Curiosity – item 6) did not load onto any component (i.e., with a loading over 0.32; Tabachnick and Fidell, 2007). For items loading onto more than one component, we opted to keep them on the component where the loading was higher, as long as the difference between loadings was above 0.2 (Pereira and Patrício, 2008). That difference was below 0.2 for 14 items, which led to their exclusion. Excluded items belong to the subscales Cooperation (items 2 and 4), Creativity (item 6), Emotional control (items 2 and 6), Empathy (items 3 and 6), Optimism (item 1), Responsibility (item 5), Self-control (item 8), and Sociability (items 2, 3, 5 and 6). The factor analysis was then redone under the same rules, resulting in a final version of 98 items loading onto 10 components. This final structure explained 52.38% of the total variance, and 69.1% of the items presented good to excellent loadings (i.e., >0.5; Comrey and Lee, 1992). Four items (Curiosity – item 4, Cooperation – item 3, Sociability – item 4, and Empathy – item 5) did not load onto any factor. Table 4 summarizes the final structure of 10 components and the loading of each of the 94 retained items on its main component, whereas Supplementary Table 2 provides the detailed results of this analysis, with all items’ loadings on all components.
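As a sketch of how the suitability checks, the forced 10-component Promax solution, and the item-retention rules described above could be reproduced outside of SPSS, the code below uses the open-source factor_analyzer package as a stand-in for the SPSS routines actually used in the study. The DataFrame name items112 is a hypothetical placeholder for the 112 retained items, and the thresholds follow the text (salient loading > 0.32, cross-loading difference of at least 0.2).

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_bartlett_sphericity, calculate_kmo

def explore_structure(items112: pd.DataFrame, n_components: int = 10):
    data = items112.dropna()

    # Suitability of the correlation matrix for factoring (Bartlett's sphericity and KMO).
    chi2, p = calculate_bartlett_sphericity(data)
    _, kmo_overall = calculate_kmo(data)
    print(f"Bartlett chi2 = {chi2:.1f} (p = {p:.3g}); KMO = {kmo_overall:.3f}")

    # Forced extraction of a fixed number of components with oblique (Promax) rotation.
    fa = FactorAnalyzer(n_factors=n_components, method="principal", rotation="promax")
    fa.fit(data)
    loadings = pd.DataFrame(fa.loadings_, index=data.columns,
                            columns=[f"C{i + 1}" for i in range(n_components)])

    # Item-retention rules described in the text.
    assignment = {}
    for item, row in loadings.abs().iterrows():
        top2 = row.nlargest(2)
        if top2.iloc[0] <= 0.32:
            continue                      # no salient loading on any component: drop item
        if top2.iloc[0] - top2.iloc[1] < 0.2:
            continue                      # ambiguous cross-loading: drop item
        assignment[item] = top2.index[0]  # keep item on its strongest component
    return loadings, assignment
```

Under this sketch, the items retained after a first pass would simply be refitted under the same rules to obtain a final solution, mirroring the two-step procedure reported above.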
The final 10 components were named as follows:

  • Component 1 – “Perseverance and Responsibility,” since it includes being persistent, responsible, and able to control task-related behavior (with items from the original Persistence/Perseverance – 8 items – and Responsibility – 5 items – subscales);
  • Component 2 – “Curiosity and Tolerance toward diversity,” since it includes all 8 items from the original Tolerance subscale and 2 from Curiosity, relating to openness to different contexts and people;
  • Component 3 – “Relations with others,” which includes mostly items from the Cooperation (5 items) and Empathy (5 items) subscales, as well as one item from the Responsibility subscale, all relating to one’s ability to collaborate with others and maintain positive relationships;
  • Component 4 – “Emotional Control and Emotional Resilience,” addressing the ability to manage and control emotions, particularly when facing distressful situations (with items from the original Emotional control – 6 items, Resilience/Stress resistance – 6 items, and Self-control – 1 item – subscales);
  • Component 5 – “Assertiveness/Leadership,” composed of all 8 items from the original Assertiveness subscale;
  • Component 6 – “Trust in others,” which includes all 8 items from the original Trust subscale, relating to one’s capacity to believe in other people’s good intentions;
  • Component 7 – “Social optimism,” composed of the remaining 7 items from the original Optimism subscale and three from the Sociability subscale, relating to one’s positive outlook on life and on starting and maintaining social relations, having friends, and leading an active social life;
  • Component 8 – “Care and concern for learning,” which includes 6 items from the original Self-control subscale, 4 from Curiosity, and two from Resilience/Stress resistance, related to one’s eagerness to learn and emotional concern and care in performing tasks;
  • Component 9 – “Creativity – Imagination,” including 3 items from the original Creativity subscale and one from Curiosity, relating to the ability to fantasize and imagine new scenarios;
  • Component 10 – “Creativity – New solutions,” which includes 4 items from the original Creativity subscale and one from Responsibility, related to the ability to come up with new ideas and original solutions.


Table 4. Factorial structure for the SSES – Child/Youth form (summarized results).

Correlations between the final constructs and the overall scale were all significant, ranging between 0.352 (component 8 – Care and concern for learning) and 0.692 (component 1 – Perseverance and Responsibility), except for component 10, Creativity – New solutions (r = 0.094). Internal consistency levels, as determined by Cronbach’s alpha, were overall good, ranging between 0.677 and 0.926, except for the Creativity – New solutions component (α = 0.449).

Discussion

This paper aimed to validate the Portuguese Child/Youth form of the Survey on Social and Emotional Skills (OECD, 2021) as a reliable, comprehensive self-report measure of a large set of social and emotional skills based on the OECD’s approach to social and emotional learning. In particular, we did so with a large Portuguese sample, testing the measure in diverse community and educational settings, and with a heterogeneous set of participants in a national initiative aimed at supporting the implementation of social and emotional intervention programs – the Gulbenkian Academies for Knowledge. The large sample size and the representativeness of the sample stand as notable strengths of this study.

Internal consistency results were good for most subscales and excellent for the overall scale. Correlations between each subscale and the overall score were moderate to high (except for the Assertiveness and Resilience/Stress Resistance subscales), indicating that the different dimensions are bound by a common underlying construct related to social and emotional skills.

Descriptive results both at the item-level and subscale level indicate overall high scores at pre-test for most subscales, as well as for the overall scale. Mean scores are similar to those found by OECD in its original study (OECD, 2021), with the subscales Cooperation and Curiosity scoring the highest, and subscales Assertiveness and Resilience/Stress Resistance showing the lowest scores.

Results on the sensitivity of the measure to change over time show the SSES can be used to measure the impact of educational and community interventions focused on social and emotional skills for children and youth in a variety of settings in Portugal. The decrease in scores for most subscales at post-test may relate to a phenomenon well documented in the literature, in which participants perceive themselves as less competent in terms of their social and emotional skills as a result of explicitly discussing them in interventions (Martinsone et al., 2022). As for the effect sizes found when testing differences between pre-test and post-test, several reviews on social and emotional learning have confirmed that these programs tend to generate small effect sizes (e.g., Payton et al., 2008; Clarke et al., 2015; Tanner-Smith et al., 2018), prompting discussion of the suitability of these commonly used standards (i.e., effect sizes) for attesting the efficacy of such interventions.

Self-report measures capture typical behaviors, thoughts, and feelings (OECD, 2021). As Duckworth and Yeager (2015) point out, they are better suited than other types of measures for assessing internal psychological states. This type of measure also promotes children’s voice, as participants provide information about themselves (Gedikoglu, 2021).

Preliminary exploratory factor analysis results suggest a 94-item, 10-component structure for the Child/Youth Form of the SSES. Although some factors clearly maintain the structure of the subscales from the original study (e.g., Assertiveness, Trust), others seem to suggest the combination of two (or more) original subscales into a unified construct (such as Perseverance and Responsibility in component 1), suggesting some shared meaning in how these skills are measured by the SSES. Additionally, some items did not load onto any factor, suggesting they may not share meaning with the other items previously organized in the same subscale.

However, recommending the use of this 94-item, 10-component structure would be premature. As previously stated, the SSES was tested in 10 different countries, with different social and cultural contexts, providing cross-cultural comparability. Research has shown there is consensus regarding the main domains of social and emotional skills, their meaning, and how they translate into daily behavior across different cultures around the world (e.g., Chernyshenko et al., 2018), even though some degree of cultural non-comparability would also be expected (OECD, 2021). Our results, which differ from the 15-subscale structure of the original SSES – Child/Youth form, may be due to cultural norms, values, or references that give different meanings to the same concepts (Jager et al., 2018). In the present study, variability in student characteristics within our sample may be a relevant factor. The original SSES study sample in Portugal was from the municipality of Sintra – a mostly urban, culturally diverse city in the greater Lisbon area – meaning students were likely to share similar community and school settings. In our study, however, a more diverse national sample was used (e.g., mostly rural versus mostly urban settings; high versus low rates of cultural diversity; diverse school and afterschool experiences), leading to the possibility of greater variability within the data. Although more research is needed on the factor structure of this measure to provide further insight into the suitability of its structure, these results also highlight that SEL interventions should take a culturally responsive approach (Hill, 2019).

Secondly, the larger age range (and the larger number of participants below and above the ages of 10 and 15) may also impact the data and the factorial structure found in this study. Indeed, research has shown that age is one of the most relevant individual characteristics influencing social and emotional skills, and that these skills develop at different rates, and are understood and translated into behavior differently, at different ages (e.g., Denham et al., 2009).

Limitations and recommendations for future research

Although representing an important step toward the use of the SSES as a practical measure of social and emotional skills for professionals, this work has limitations. Firstly, the heterogeneous sample (in terms of child/adolescent and family characteristics) may hinder the study of its psychometric properties, as it adds variability due to participants’ characteristics. For instance, it would be beneficial if analyses were conducted for separate age groups, since we know these skills develop differently throughout childhood and adolescence (OECD, 2021) and may be understood differently by participants of different ages. Future research should explore the validity of this instrument for different age groups in a more detailed manner than in the present study or in the OECD’s SSES report (OECD, 2021). Additionally, when addressing its applicability to different age groups, one of the instrument’s limitations is its inadequacy for children under 8 years old, both due to literacy constraints and to the conceptual framing of the included skills for children of a young age. This is similar to what occurs with other SEL measures, as noted by Martinez-Yarza et al. (2023).

Heterogeneity is also present in the data collection procedures employed by the different Academies. Despite there being a script and protocol recommendations for the administration of the SSES by the Academies, each team adjusted the data collection process to its context and participants. This diversity in procedures was necessary in order to better meet the needs and characteristics of each specific intervention, target population, and implementation team. However, it also accounts for some heterogeneity in who administered the survey (a teacher, a facilitator, or older participants responding autonomously at home), the report format (online or paper), and the setting in which it took place (the classroom, at home, during one of the program sessions). This stands as a limitation to the quality of the data and could have an impact on the validation of the measure.

Because no other instrument measuring social and emotional skills was administered in the GAK context with a sample comparable to that of the SSES – both in size and in characteristics – no analyses of concurrent or convergent validity were conducted. This stands as an important limitation, and a strong recommendation for future research using the SSES.

Further research is necessary on the SSES – Child/Youth form’s factorial structure, since the solution found in this paper is not clear from a theoretical perspective and does not greatly improve the instrument’s psychometric properties when compared to its original structure. The fact that the Energy subscale was not included in this validation study also stands as a limitation, since there was no data that allowed us to test the validity of the complete 120-item version of this Child/Youth Form.

Similarly, future research should take into consideration individual differences in how these skills develop, to better understand the effectiveness of their measurement. Participants’ sex and age, for instance, are key features in the development of social and emotional skills, since research has found individual differences based on these two variables (OECD, 2021). The same is true for family characteristics, such as mother’s educational level, which is related to socioeconomic status (e.g., Aarø et al., 2009) and to the child’s success throughout life (e.g., Akram and Pervaiz, 2020). The child’s socioeconomic status is also related to differences in the development of social and emotional skills (OECD, 2021).

It is also necessary to validate the SSES’s two other forms – for parents and for teachers, also available in Portuguese – as valuable measures of children’s social and emotional skills reported by meaningful adult figures in their daily lives. Triangulation of informants, by combining the perspectives of children/youth and of those around them, ensures greater rigor, quality, and reliability in evaluating these skills, allowing a more detailed picture of social and emotional learning and development to be formed (Kankaraš et al., 2019). The same is true for methodological triangulation, suggesting the use of measures beyond self-report and other-report, such as observational tools or situational judgment tests (e.g., Abrahams et al., 2019; Murano et al., 2021).

Conclusion

The purpose of this paper was to add evidence on a valuable measure for educational, social, and community practitioners to evaluate social and emotional skills in their target audiences, as well as the effectiveness of their SEL interventions. Our results, strengthened by a very large and representative national sample, help demonstrate the utility of this measure for educational and community practitioners to inform and guide their work on social and emotional skills with a varied set of participants, adequately measuring their needs and their strengths. It is particularly useful given the diversity of available instruments, developed under different conceptual frameworks, many of which focus on a specific skill or subset of skills. The SSES was developed as a comprehensive measure of a large set of social and emotional skills, anchored in a sound, common theoretical framework.

Data availability statement

The datasets presented in this article are not readily available because the data pertains to several different institutions, and it is each institution’s responsibility to grant access. Requests to access the datasets should be directed to pgconhecimento@gulbenkian.pt.

Ethics statement

Ethical approval was not required for the study involving human participants, in accordance with local legislation and institutional requirements, because the paper presents only secondary analysis of data collected by third parties, which was completely anonymized prior to analysis. All third parties guaranteed data anonymity, thereby protecting participant identity and maintaining privacy. Written informed consent for participation in this study was provided by the participants’ legal guardians/next of kin.

Author contributions

CCa conducted data cleaning, data analysis, and wrote sections of the manuscript. JA contributed to the introduction and discussion sections of the manuscript and conducted a final revision. CB conducted data cleaning, contributed to data analysis and interpretation, and a review of the manuscript. CCo conducted data cleaning and a final revision of the manuscript. All authors contributed to the article and approved the submitted version.

Funding

CCa was funded by the Portuguese Foundation for Science and Technology (FCT) doctoral grant (UI/BD/154443/2022). CB was funded by the FCT through the CEEC institutional funding (CEECINST/00126/2021).

Acknowledgments

The authors would like to thank all Academies, their participants and families, as well as the Calouste Gulbenkian Foundation, for contributing to making this work possible.

Conflict of interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Publisher’s note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

Supplementary material

The Supplementary material for this article can be found online at: https://www.frontiersin.org/articles/10.3389/fpsyg.2023.1214032/full#supplementary-material

Footnotes

1. Scores were obtained using categories related to the Portuguese schooling system: 0 = cannot read or write; 1 = up to the 4th grade; 2 = up to the 6th grade; 3 = up to the 9th grade; 4 = up to the 12th grade; 5 = university degree.

References

Aarø, L. E., Flisher, A. J., Kaaya, S., Onya, H., Namisi, F. S., and Wubs, A. (2009). Parental education as an indicator of socioeconomic status: improving quality of data by requiring consistency across measurement occasions. Scand. J. Public Health 37, 16–27. doi: 10.1177/1403494808086917

Abrahams, L., Pancorbo, G., Primi, R., Santos, D., Kyllonen, P., John, O. P., et al. (2019). Social-emotional skill assessment in children and adolescents: advances and challenges in personality, clinical, and educational contexts. Psychol. Assess. 31, 460–473. doi: 10.1037/pas0000591

Akram, S., and Pervaiz, Z. (2020). Mother’s education as a predictor of Individual’s opportunities to learn and earn. Element. Educ. 19, 879–884. doi: 10.17051/ilkonline.2020.04.196

Boudett, K. P., City, E. A., and Murnane, R. J. (2013). Data Wise: A Step-by-Step Guide to Using Assessment Results to Improve Teaching and Learning. Harvard Education Press.

Boudett, K. P., Murnane, R. J., City, E., and Moody, L. (2005). Teaching educators how to use student assessment data to improve instruction. Phi Delta Kappan, 86, 700–706.

Brush, K. E., Jones, S. M., Bailey, R., Nelson, B., Raisch, N., and Meland, E. (2022). Social and emotional learning: From conceptualization to practical application in a global context. Life skills education for youth: Critical perspectives. 43–71.

Chernyshenko, O., Kankaraš, M., and Drasgow, F. (2018). Social and emotional skills for student success and well-being: conceptual framework for the OECD study on social and emotional skills (OECD Education Working Papers No. 173), OECD Education Working Papers. Paris: OECD Publishing.

Clarke, A.M., Morreale, S., Field, C.-A., Hussein, Y., and Barry, M.M. (2015). What works in enhancing social and emotional skills development during childhood and adolescence? A review of the evidence on the effectiveness of school-based and out-of-school programmes in the UK. A report produced by the World Health Organization Collaborating Centre for Health Promotion Research, National University of Ireland Galway. Available at: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/411492/What_works_in_enhancing_social_and_emotional_skills_development_during_childhood_and_adolescence.pdf (Accessed June 6, 2022).

Comrey, A. L., and Lee, H. B. (eds.). (1992). Interpretation and application of factor analytic results. A first course in factor analysis, 2.

Denham, S. A., Wyatt, T. M., Bassett, H. H., Echeverria, D., and Knox, S. S. (2009). Assessing social-emotional development in children from a longitudinal perspective. J. Epidemiol. Community Health 63, i37–i52. doi: 10.1136/jech.2007.070797

Domitrovich, C., Durlak, J., Staley, K., and Weissberg, R. (2017). Social-emotional competence: an essential factor for promoting positive adjustment and reducing risk in school children. Child Dev. 88, 408–416. doi: 10.1111/cdev.12739

Duckworth, A. L., and Yeager, D. S. (2015). Measurement matters: Assessing personal qualities other than cognitive ability for educational purposes. Res. Educ. 44, 237–251. doi: 10.3102/0013189X15584327

Durlak, J., and DuPre, E. (2008). Implementation matters: a review of research on the influence of implementation on program outcomes and the factors affecting implementation. Am. J. Community Psychol. 8, 327–350. doi: 10.1007/s10464-008-9165-0

Dziuban, C. D., and Shirkey, E. C. (1974). When is a correlation matrix appropriate for factor analysis? Some decision rules. Psychol. Bull. 81, 358–361. doi: 10.1037/h0036316

Gedikoglu, M. (2021). Social and emotional learning: an evidence review and synthesis of key issues. London, UK: Education Policy Institute.

Hill, M. (2019). Culturally responsive social and emotional learning (SEL). Inspired Ideas. Available at: https://medium.com/inspired-ideas-prek-12/culturally-responsive-social-and-emotional-learning-be7fb6e3d58d (Accessed July 28, 2023).

Humphrey, N., Kalambouka, A., Wigelsworth, M., Lendrum, A., Deighton, J., and Wolpert, M. (2011). Measures of social and emotional skills for children and young people: a systematic review. Educ. Psychol. Meas. 71, 617–637. doi: 10.1177/0013164410382896

Hutcheson, G.D., and Sofroniou, N. (1999). The multivariate social scientist: introductory statistics using generalized linear models. Thousand Oaks, CA: SAGE.

Jager, R. J., Rivas-Drake, D., and Borowski, T. (2018). Equity & Social and emotional learning: a cultural analysis. Framework briefs: Special issue series. Measuring SEL – Using Data to Inspire Practice.

Kankaraš, M., Feron, E., and Renbarger, R. (2019). Assessing students’ social and emotional skills through triangulation of assessment methods (OECD Education Working Papers No. 208), OECD Education Working Papers. Paris: OECD Publishing.

Kankaraš, M., and Suarez-Alvarez, J., (2019). Assessment framework of the OECD study on social and emotional skills (OECD Education Working Papers No. 207), OECD Education Working Papers. Paris: OECD Publishing

Marôco, J. (2018). Análise Estatística com o SPSS Statistics, 7a edição. Pêro Pinheiro: ReportNumber, Lda.

Martinez-Yarza, N., Santibáñez, R., and Solabarrieta, J. (2023). A systematic review of instruments measuring social and emotional skills in school-aged children and adolescents. Child Ind. Res. 16, 1475–1502. doi: 10.1007/s12187-023-10031-3

Martinsone, B., Stokenberga, I., Damberga, I., Supe, I., Simões, C., Lebre, P., et al. (2022). Adolescent social emotional skills, resilience and behavioral problems during the COVID-19 pandemic: a longitudinal study in three European countries. Front. Psych. 13:942692. doi: 10.3389/fpsyt.2022.942692

McCrae, R. R., and Costa, P. T.Jr. (1997). Personality trait structure as a human universal. Am. Psychol. 52, 509–516. doi: 10.1037/0003-066X.52.5.509

Murano, D., Lipnevich, A. A., Walton, K. E., Burrus, J., Way, J. D., and Anguiano-Carrasco, C. (2021). Measuring social and emotional skills in elementary students: development of self-report Likert, situational judgment test, and forced choice items. Personal. Individ. Differ. 169:110012. doi: 10.1016/j.paid.2020.110012

OECD (2021). Beyond academic learning: first results from the survey of social and emotional skills 2019. Paris: OECD Publishing

Pasquali, L. (1999). Instrumentos psicológicos: Manual prático de elaboração. Brasília, Brasil: LabPAM/IBAPP.

Payton, J., Weissberg, R.P., Durlak, J.A., Dymnicki, A.B., Taylor, R.D., Schellinger, K.B., et al. (2008). The positive impact of social and emotional learning for kindergarten to eighth-grade students: findings from three scientific reviews. Chicago, IL: Collaborative for Academic, Social, and Emotional Learning.

Pereira, A., and Patrício, T. (2008). SPSS: Guia prático de utilização. Análise de dados para ciências sociais e psicologia, 7.

Martins, G. D. O., Gomes, C. A. S., Brocardo, J., Pedroso, J. V., Camilo, J. L. A., Silva, L. M. U., et al. (2017). Perfil dos alunos à saída da escolaridade obrigatória. Lisbon, Portugal: Ministério da Educação / Direção Geral da Educação. Available at: https://comum.rcaap.pt/bitstream/10400.26/22377/1/perfil_dos_alunos.pdf (Accessed June 3, 2023).

Smith-Donald, R., Raver, C. C., Hayes, T., and Richardson, B. (2007). Preliminary construct and concurrent validity of the preschool self-regulation assessment (PSRA) for field-based research. Early Child. Res. Q. 22, 173–187. doi: 10.1016/j.ecresq.2007.01.002

Tabachnick, B.G., and Fidell, L.S. (2007). Using multivariate statistics, using multivariate statistics, 5th ed. Allyn & Bacon/Pearson Education, Boston, MA.

Tackett, J. L., Slobodskaya, H. R., Mar, R. A., Deal, J., Halverson, C. F.Jr, Baker, S. R., et al. (2012). The hierarchical structure of childhood personality in five countries: Continuity from early childhood to early adolescence. J. Pers. 80, 847–879.

Tanner-Smith, E. E., Durlak, J. A., and Marx, R. A. (2018). Empirically based mean effect size distributions for universal prevention programs targeting school-aged youth: a review of meta-analyses. Prev. Sci. 19, 1091–1101. doi: 10.1007/s11121-018-0942-1

Weissberg, R., Durlak, J., Domitrovich, C., and Gullotta, T. P. (2015). “Social and emotional learning: past, present and future” in Handbook for Social and Emotional Learning. eds. J. A. Durlak, C. E. Domitrovich, R. P. Weissberg, and T. P. Gullotta (New York: Guilford), 3–19.

Keywords: adolescents, children, measures, social and emotional skills, social emotional learning, validation

Citation: Castro C, Barata C, Alexandre J and Colaço C (2023) Validation of a community-based application of the Portuguese version of the survey on Social and Emotional Skills – Child/Youth Form. Front. Psychol. 14:1214032. doi: 10.3389/fpsyg.2023.1214032

Received: 28 April 2023; Accepted: 04 August 2023;
Published: 21 August 2023.

Edited by:

Pedro Rosário, University of Minho, Portugal

Reviewed by:

Juliana Martins, University of Minho, Portugal
Guida Veiga, University of Évora, Portugal

Copyright © 2023 Castro, Barata, Alexandre and Colaço. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Catarina Castro, cacco11@iscte-iul.pt
