ORIGINAL RESEARCH article

Front. Lang. Sci., 27 November 2024
Sec. Reading

Design and validation of a Primary Reading Proficiency Test (PCL-P)

  • Department of Education, University of Deusto, Bilbao, Spain

Reading proficiency is one of the most important skills students develop at school. However, data on the reading performance of Spanish schoolchildren show a downward trend. To measure this performance, a reading proficiency assessment test for Primary School students (PCL-P) was developed. Based on a multilevel model (executive, functional, instrumental, and epistemic), the test content was evaluated by five pedagogical and linguistic experts (inter-judge agreement: Fleiss' Kappa = 0.88). The test was then administered to a sample of 216 students (6–12 years old) from the autonomous community of the Basque Country. Cronbach's alpha was 0.83. An exploratory factor analysis (principal components method with Varimax rotation) yielded five factors explaining 59% of the variance: (1) word and text recognition and structural understanding; (2) evaluation and critical reflection of information for the creation of new ideas; (3) advanced/deep reading strategies; (4) phonological awareness; and (5) structural understanding of language and detection of key elements. In conclusion, the PCL-P is a valid instrument for the assessment of reading proficiency in Primary Education.

1 Introduction

Reading proficiency is considered one of the most important skills for students to develop during compulsory schooling, as it is essential not only for their academic development but also for their future social and labor market integration (Hanushek, 2013; Díaz-Iso et al., 2022). This competence enables people to read in order to learn in any area of life and discipline of knowledge (Solé, 2004; Díaz-Iso et al., 2022). Learning to read and the development of reading skills are crucial in the educational context (Rosas et al., 2011; Hanushek, 2013; Guzmán-Simón et al., 2020) and, particularly, in the teaching-learning process, as they enable further learning (Rosas et al., 2011) and have direct effects on the rest of the curricular areas (Robledo et al., 2019).

Primary school students need to develop strong reading proficiency to respond effectively to the demands of school (Jones et al., 2018; Díaz-Iso et al., 2022), as reading proficiency underpins school success (Brigman et al., 2018; Díaz-Iso et al., 2022). Thus, when students develop reading proficiency poorly or late, they are likely to achieve poor academic results, which can lead, in the worst case, to failure and subsequent dropout (Guzmán-Simón et al., 2020). For this reason, interest in reading proficiency is a constant in educational and scientific circles (Robledo et al., 2019).

A review of the literature reveals how the definition of reading proficiency has evolved over the years. Traditionally, it was defined as the product of decoding and listening comprehension (Hoover and Gough, 1990). In recent years, however, many researchers have reconfigured this definition, placing the focus on the ability to extract and construct meaning from the close reading of a written text (Snow, 2002; McKenna and Stahl, 2003; Cadime et al., 2013; Robledo et al., 2019). This competence includes cognitive and linguistic skills ranging from decoding to knowledge of words, grammar, and linguistic and textual structures, as well as the integration of meaning with the reader's prior knowledge (Rosas et al., 2011; Cadime et al., 2013). It therefore involves understanding, using, and reflecting on written texts to develop knowledge (Morales and Espinosa, 2017). Reading is thus a multidimensional thinking process that occurs in the interaction between the text, the reader, and the context (Rosas et al., 2011; Ballester, 2015). The reader plays an active role during this process, deploying a series of strategies to understand what is read (Cadime et al., 2013), for example, the generation of inferences (Rosas et al., 2011).

Several authors have constructed theoretical models to explain the different levels of reading proficiency that readers can reach (Freebody and Luke, 1990; Català et al., 2001; Winlund, 2021). Kintsch (1988) distinguishes three, based on the mental representations that readers can build during the reading process: superficial, propositional, and situational. At the superficial level, the reader decodes the words and phrases of the text, interpreting it literally. At the propositional level, the reader understands the relationships between words to form ideas and grasps the logical and semantic meaning of sentences and paragraphs. At the situational level, the reader integrates information from the text with prior knowledge to create a coherent and contextualized mental representation, making inferences or predicting consequences. For their part, Català et al. (2001) propose a taxonomy consisting of four basic levels. First, literal comprehension relates to the ideas and information that are explicitly given in the text. Second, reorganization requires the learner to analyze, synthesize, and/or organize the ideas or information explicitly given in the text. Next, inferential or interpretative comprehension refers to the student's use of explicit ideas and information to make conjectures and develop hypotheses. Finally, critical or judgmental comprehension is the level at which the student makes an evaluative judgement, comparing the ideas presented in the text with other criteria.

This study is based on the four-level proposal of Freebody and Luke (1990). This model was chosen because it has been widely implemented in educational research and offers a breakdown of literacy into specific, measurable metalinguistic skills. Alford and Jetnikoff (2016) applied this model in Australian secondary schools, finding that textual analysis exercises significantly improved students' ability to interpret and critique texts from multiple perspectives. Preston (2018) extended the model's application to rural educational settings, showing its adaptability for enhancing comprehension and critical engagement skills across diverse student populations. Rafi (2020) demonstrated the model's relevance in the digital age, using it to analyze digital media literacy among high school students in Indonesia, where it helped students critically assess online content. More recently, Winlund (2021) used this framework to explore the reading and writing practices of immigrant students in a Swedish language school, finding that students engaged in literacy practices differently based on individual challenges and assets. Likewise, Tshering (2023) examined the model's use by an ESL teacher in Bhutan, highlighting its effectiveness in developing decoding, meaning-making, linguistic analysis, and critical reading skills among grade-9 students.

Building on the evidence of its successful applications, this study delves into the specific levels defined by Freebody and Luke (1990) to deepen our understanding of how each dimension contributes to reading competency. Firstly, the executive level is defined as the basic ability to understand and use the written code, and includes four metalinguistic skills. (1) Letter and word recognition: identifying and differentiating letters of the alphabet and words as a whole. (2) Phonological awareness: the ability to recognize and manipulate the sounds of spoken language (e.g., syllables). (3) Phonemic awareness: the ability to recognize and manipulate individual sounds or phonemes within words. (4) Morphological awareness: recognizing and understanding the structural components of words (e.g., roots, prefixes, and suffixes) in order to break them down into parts and better understand their meaning. All of these metalinguistic skills enable readers to decode the literal meaning of words and texts effectively and accurately (Freebody and Luke, 1990).

Secondly, the functional level refers to the ability to apply reading skills to face and solve the challenges and tasks of everyday life. It contains five metalinguistic skills. (1) Basic understanding of the grammatical structure of language: understanding how words are organized in sentences and how grammatical rules influence the meaning of a text. (2) Identification of key words in instructions and simple texts in order to grasp the essential information. (3) Use of context to infer the meaning of unknown words, enriching the overall understanding of the text. (4) Understanding of basic vocabulary related to everyday life, that is, common terms and expressions used in daily situations. (5) Identification of the logical sequence in simple instructions and procedures. At this level, readers can understand basic texts and correctly interpret the information presented in them (Freebody and Luke, 1990).

Thirdly, the instrumental level is the ability to search for, analyze, and use information from a text in a complex way in order to access knowledge. It encompasses five advanced metalinguistic skills. (1) Understanding of the structure and organization of different types of texts (e.g., narrative, expository, and argumentative). (2) Making inferences of meaning in more complex contexts, that is, deducing implicit information from a text using contextual clues to fill in information gaps and understand underlying messages. (3) Recognition of the structure and function of different text elements (e.g., headings, subheadings, and paragraphs) to understand how they contribute to the organization of content. (4) Critical evaluation of the reliability and relevance of the information a text contains. (5) Use of active reading strategies (e.g., underlining, note-taking, and summarizing) to understand the text more comprehensively or to synthesize and retain information effectively. At this level, readers are able to comprehend longer and more complex texts, such as newspaper articles, in greater depth and extract meaningful information to make decisions or solve problems (Freebody and Luke, 1990).

Finally, the epistemic level is the ability to critically reflect on the information provided by a text, contrast different perspectives and use knowledge effectively and creatively. It includes five metalinguistic skills. (1) The critical analysis of arguments and points of view presented in complex texts to evaluate their coherence and identify their conclusions. (2) The evaluation of the validity of evidence and arguments presented in academic and scientific texts, detecting possible fallacies. (3) The identification of biases and manipulations in the presentation of information. (4) Critical reflection on the relationship between the information presented and previous knowledge in order to modify or expand the latter. (5) The generation of new ideas and perspectives from the analysis and synthesis of information (e.g., formulating new hypotheses and creating new concepts). At this level, the reader can deeply understand complex texts and interact with them critically to think abstractly and creatively about knowledge (Freebody and Luke, 1990).

In order to determine whether Spanish students are effectively developing reading proficiency, Spain participates regularly in several international assessment studies, the best known being PISA and PIRLS (Pascual et al., 2021). PISA assessments are grounded in a socio-constructivist framework, evaluating students' ability to apply their reading skills to real-life situations and to interpret texts critically and reflectively (Organisation for Economic Co-operation and Development, 2019). Similarly, PIRLS follows a cognitive-processing approach, assessing skills such as retrieving information, making inferences, and interpreting narrative and informational texts (Mullis and Martin, 2021).

According to the latest PISA results, Spanish students have shown a downward trend in reading achievement since 2015. In 2022, Spain scored 474 points, 22 points lower than in 2015, and the Basque Country also declined, by 32 points (Ministerio de Educación, Formación Profesional y Deportes, 2023). Similarly, PIRLS reports an average reading proficiency score of 521 for Spain, i.e., students are on average at an intermediate level; 5% are at the very low level, 20% at the low level, 40% at the intermediate level, 30% at the high level, and 6% at the advanced level. PIRLS confirms that average performance decreased from 528 points in 2016 to 521 points in 2021 (Ministerio de Educación y Formación Profesional, 2023). All of these data suggest that students' reading proficiency is declining year by year.

These results indicate that some students have serious difficulties in understanding the texts they read, while many reach only a superficial and fragmented level of comprehension (Graham, 2020; Gil, 2022), which implies that their reading proficiency is lower than expected for their age (Ammermüller, 2012). Students with poor reading proficiency may have several sources of difficulty, including problems with decoding, with retrieving the meaning of words or sentences, or with integrating the meanings of sentences to construct a situational model of the text (Camarillo et al., 2021; Pascual et al., 2021; Díaz-Iso et al., 2022).

Considering all of the above, early detection of low reading proficiency in primary school students is necessary, as it also makes it possible to identify specific reading difficulties. Several assessment tests are available for this purpose (Pascual et al., 2021). In Spain, tests such as the PROLEC-R have been designed to assess letter identification, word recognition, and syntactic and semantic processes (Cuetos et al., 2007), and the ACL (Català et al., 2001) and EDICOLE (Gómez-Veiga et al., 2020) to assess reading comprehension.

However, these tests are rarely used in schools, as teachers tend to prioritize continuous student assessment over external examination (Pascual et al., 2021). In addition, these tests present two problems. First, some are not well grounded in a particular theoretical model or are not based on the most recent research on reading proficiency. Second, others have not been adequately validated, so they cannot guarantee that they adequately represent the construct they are intended to measure or that the scores obtained with them are accurate, which limits the possibility of identifying students who perform below their reference group (Cadime et al., 2013; Pascual et al., 2021). For these reasons, it was decided to create a new reading proficiency test based on a solid theoretical model, that of Freebody and Luke (1990). Thus, this research aims to validate a reading proficiency assessment test aimed at students from 1st to 6th grade of Primary Education, the Primary Reading Proficiency Test (PCL-P).

2 Materials and methods

2.1 Context

The study was conducted at a state-subsidized school located in a municipality in the province of Bizkaia, within the autonomous community of the Basque Country, Spain. The school caters to students from infant school through primary education and follows linguistic Model B, in which instruction is delivered in three vehicular languages: 49% in Basque, 29% in Spanish, and 22% in English. For example, mathematics is taught in Spanish, natural sciences in English, and social sciences in Basque.

The institution is situated in a region known for its rich cultural heritage and strong emphasis on bilingual education. The school's curriculum is designed to promote proficiency in all three languages, preparing students for a multicultural and multilingual environment. The student population at this school is diverse, predominantly comprising children from middle socio-economic backgrounds. A significant portion of the students are from migrant families, primarily from Morocco and Latin America, adding to the cultural and linguistic diversity of the school.

The school also provides various support programs to assist students from migrant backgrounds in adapting to the trilingual educational system and achieving academic success. Additionally, the educational approach at the school emphasizes early literacy, with reading skills introduced in infant school and solidified during the primary years. This structured approach to literacy aims to ensure that students develop strong reading abilities.

2.2 Participants

Participants were selected using non-probabilistic purposive or convenience sampling. The sample consisted of 216 primary school students aged 6 to 12 years. In Spain, primary education consists of six grades, grouped into three cycles: 1st cycle (1st and 2nd courses), 2nd cycle (3rd and 4th courses), and 3rd cycle (5th and 6th courses). The breakdown of the sample by cycle, course, and class is presented in Table 1.

Table 1. Sample size by cycle, course, and class.

The participants represent a cross-section of the school's diverse student body. The selection process aimed to include students from different socio-economic backgrounds and cultural origins to ensure a comprehensive representation of the school's demographic profile. The majority of participants come from middle-class families, with a notable proportion of students being children of immigrants. This diversity within the sample provides valuable insights into the educational experiences of students in a multilingual and multicultural setting.

2.3 Instrument

The PCL-P is a test that aims to assess the reading proficiency of primary school students. The instrument was designed in Spanish, as it is the first language of most of the students in the sample, but it is planned to be applied in a multilingual context (Basque, English, and Spanish). In addition, it was constructed on the basis of Freebody and Luke's (1990) four-level model of reading proficiency.

Three versions of the test were designed, one for each school cycle. Because reading proficiency evolves significantly throughout primary education, producing a version for each cycle acknowledges the differences in reading skills at each stage of development and makes it possible to identify specific changes and progress that would not be visible with a single test.

In the construction of the instrument, the decision was made to measure 15 of the original 20 metalinguistic skills for reading proficiency. This decision was grounded in a review of the context and relevance of each skill across the cycles considered in the study. Firstly, some skills were not included because their relevance varied among educational cycles: certain skills were not uniformly applicable across all cycles due to differences in cognitive development and in the specific curricular objectives at each level. This led to the prioritization of those skills that provided a more comprehensive and representative perspective of reading competence in all evaluated contexts. Moreover, some skills may appear to be omitted because they are implicit in more complex skills. For instance, fundamental abilities like letter recognition are inherently included within more developed skills like word recognition. This approach allowed the focus to be placed on measurements that capture multiple facets of reading competence without unnecessarily segmenting the cognitive process involved. This conceptual overlap was deliberately exploited to simplify the instrument, enhance its efficiency, and avoid redundancies that could affect the precision and clarity of the measurements.

For each skill, one activity was designed. The test is therefore made up of multiple types of activities, for example, writing correctly words that have been spoken incorrectly, joining prefixes and suffixes, reading a text and answering questions, ordering the paragraphs of a text, and identifying the parts of a text and its main ideas. Additionally, it should be noted that, although each item initially had a different score, for the statistical analysis all scores were standardized to a scale from 0 to 1. Table 2 shows which skills are included at each level, together with the activity designed to measure each skill, according to the school year.

Table 2. Metalinguistic skills and their activities.
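
As an illustration of the 0–1 rescaling mentioned above (a minimal sketch, not the authors' actual scoring script), the following Python fragment divides each activity score by its maximum possible value; the activity names and maxima are hypothetical.

```python
import pandas as pd

# Hypothetical raw scores: one column per activity, each with its own maximum score.
raw = pd.DataFrame({
    "word_recognition": [3, 5, 4],         # activity scored 0-5
    "phonological_awareness": [7, 9, 10],  # activity scored 0-10
})
max_score = {"word_recognition": 5, "phonological_awareness": 10}

# Rescale every activity to the 0-1 range so that items with different
# maximum scores become comparable in the statistical analyses.
standardized = raw.apply(lambda col: col / max_score[col.name])
print(standardized)
```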

2.4 Procedure

After the school requested help in assessing the reading proficiency of primary school students, the researchers applied to the Ethics Committee of [blinded for peer review] for permission to conduct the research. With the ethical approval, the families (parents and legal guardians of the students) were informed about the study and asked to sign an informed consent form before administering the PCL-P, thus ensuring their understanding and acceptance of their children's participation in the research.

Prior to the administration, a pilot test was conducted with a small group of students. This preliminary phase was intended to identify and troubleshoot potential problems in the procedure, ensuring that the test was adequate and effective for the intended purpose.

Once the pilot test was completed and the necessary adjustments were made (for example, rewording certain texts or adjusting the length of some activities), the PCL-P was administered in the classrooms of the school. The evaluation was conducted in a single 55-minute session during the second quarter of the 2023–2024 academic year. Although it can be difficult for primary students to maintain attention for so long, the instrument was administered in this way at the school's request. The assessments were carried out by education professionals who had been previously trained in the application of the instrument. Students answered the test individually. The administration was carried out with the utmost care to maintain a calm and distraction-free environment, allowing students to fully concentrate on the task.

2.5 Data analysis

The analyses were carried out with the statistical package SPSS version 28. First, univariate descriptive statistics were calculated for each of the reading proficiency metalinguistic skills, and for each cycle, course, and class; the minimum score, maximum score, mean, mode, and standard deviation were obtained. Second, the internal consistency reliability of the PCL-P was estimated by calculating Cronbach's alpha for each cycle. Content validity analyses were also carried out, for which an expert judgment was conducted and inter-rater reliability was calculated using Fleiss' Kappa. Finally, construct validity was examined through an Exploratory Factor Analysis, which allows the structural dimensionality of the instrument to be analyzed. A Confirmatory Factor Analysis was not performed due to the small sample size used in the research.
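
As a sketch of the kind of univariate summary described here (the study itself used SPSS 28), the following Python fragment computes the minimum, maximum, mean, mode, and standard deviation of each skill per cycle; the data frame and skill names are hypothetical.

```python
import pandas as pd

# Hypothetical standardized (0-1) scores, one row per student.
df = pd.DataFrame({
    "cycle": [1, 1, 2, 2, 3, 3],
    "word_recognition": [0.4, 0.8, 0.6, 0.9, 0.7, 1.0],
    "phonemic_awareness": [0.5, 0.7, 0.8, 0.6, 0.9, 0.95],
})
skills = ["word_recognition", "phonemic_awareness"]

def mode_first(s: pd.Series) -> float:
    """Return the first mode of a series (NaN if the series is empty)."""
    m = s.mode()
    return m.iloc[0] if not m.empty else float("nan")

# Minimum, maximum, mean, mode, and standard deviation per cycle and skill.
summary = df.groupby("cycle")[skills].agg(["min", "max", "mean", mode_first, "std"])
print(summary)
```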

3 Results

3.1 Descriptive analysis

The mean and standard deviation obtained in each metalinguistic skill evaluated for each cycle are shown in Table 3.

Table 3. Descriptive analysis.

3.2 Internal consistency reliability

The Cronbach's Alpha coefficient of the PCL-P for the 1st cycle is 0.83, 0.74 for the 2nd cycle and 0.72 for the 3rd cycle. All three indices are above 0.70, indicating that the instrument is reliable (Field, 2013). It should be clarified that this coefficient has been calculated for each cycle because, although the three versions of the instrument measure the same metalinguistic skills, the activities that evaluate them are occasionally different. Therefore, calculating Cronbach's Alpha for each cycle separately ensures a more accurate assessment of the internal consistency and reliability for each specific set of activities tailored to the developmental stage of the students.
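
For readers who wish to reproduce this kind of reliability estimate outside SPSS, a minimal Cronbach's alpha sketch is shown below; the item score matrix is hypothetical, not PCL-P data.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_students x n_items) score matrix."""
    k = items.shape[1]                         # number of items
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the total score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical standardized (0-1) scores for five items of one cycle's version.
scores = np.array([
    [0.8, 0.6, 0.7, 0.9, 0.5],
    [0.4, 0.5, 0.3, 0.6, 0.4],
    [0.9, 0.8, 0.8, 1.0, 0.7],
    [0.6, 0.7, 0.5, 0.7, 0.6],
])
print(round(cronbach_alpha(scores), 2))
```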

3.3 Content validity analysis

To ensure the content validity of the PCL-P, an expert judgment was carried out. The judges evaluated both the activities that make up the test and the theoretical construct on which it is based, i.e., the four levels and their metalinguistic skills. The panel comprised five national and international experts with more than 10 years of experience in linguistics and pedagogy, which lends the necessary weight to this type of judgment (Gwet, 2014). All of them agreed to participate voluntarily in the assessment of the instrument.

In addition to making any comments or observations they considered appropriate, the expert judges individually rated as high, medium, or low the correspondence between the theoretical model and the content of the test, the agreement and relevance of each metalinguistic skill with its corresponding level (dimension), and the clarity and appropriateness of each activity for measuring the metalinguistic skill it claims to measure.

Once these scores were collected, Fleiss' Kappa coefficient was calculated. This coefficient assesses the level of agreement among three or more raters who independently judge items on an ordinal scale. In this case, the coefficient has a value of 0.879, which, on the benchmarks of Landis and Koch (1977), indicates almost perfect agreement between raters.
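
A minimal sketch of how Fleiss' Kappa could be computed outside SPSS is shown below; it assumes a hypothetical count matrix in which each row is one rated element and each column records how many of the five judges chose the low, medium, or high rating.

```python
import numpy as np

def fleiss_kappa(counts: np.ndarray) -> float:
    """Fleiss' Kappa for an (n_elements x n_categories) matrix of rating counts.
    Every row must sum to the same number of raters."""
    n_elements = counts.shape[0]
    n_raters = counts.sum(axis=1)[0]
    # Proportion of all ratings falling in each category.
    p_cat = counts.sum(axis=0) / (n_elements * n_raters)
    # Observed agreement for each rated element.
    p_elem = (np.square(counts).sum(axis=1) - n_raters) / (n_raters * (n_raters - 1))
    p_obs = p_elem.mean()
    p_exp = np.square(p_cat).sum()
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical ratings: 4 elements, categories (low, medium, high), 5 judges each.
ratings = np.array([
    [0, 1, 4],
    [0, 0, 5],
    [1, 1, 3],
    [0, 2, 3],
])
print(round(fleiss_kappa(ratings), 3))
```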

In addition, the researchers conducted a review of the PCL-P and introduced minor changes, considering the recommendations offered by the experts. Most of the changes were aimed at making the texts more understandable.

3.4 Exploratory factor analysis

An Exploratory Factor Analysis was carried out to study the internal structure of the PCL-P. Although the instrument was designed according to a specific theoretical model of reading proficiency, it is necessary to explore whether the test fits the structure of that model or suggests a new one. This analysis was done with the total sample of students, not by cycles, due to the limited sample size.

To determine whether it was appropriate to carry out the Exploratory Factor Analysis, the KMO index was computed, giving a value of 0.74 (higher than 0.60 and close to 1), together with Bartlett's test of sphericity, which yielded p < 0.001 (see Table 4). This allows us to reject the null hypothesis and conclude that the metalinguistic skills measured in the PCL-P are sufficiently correlated with one another to perform the analysis (López-Aguado and Gutiérrez-Provecho, 2019).

Table 4. KMO and Bartlett's test of sphericity.

The determinant of the correlation matrix has a value of 0.066 (very close to 0), which confirms that there is sufficient correlation between the metalinguistic skills for them to form factors.
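
The following sketch shows how the KMO index, Bartlett's test of sphericity, and the determinant of the correlation matrix could be obtained with NumPy/SciPy rather than SPSS; the 216 × 15 score matrix is randomly generated and therefore hypothetical.

```python
import numpy as np
from scipy import stats

def kmo_bartlett(data: np.ndarray):
    """Overall KMO, Bartlett's chi-square and p-value, and the determinant of the
    correlation matrix for an (n_students x n_skills) score matrix."""
    n, p = data.shape
    corr = np.corrcoef(data, rowvar=False)

    # Partial correlations obtained from the inverse of the correlation matrix.
    inv = np.linalg.inv(corr)
    partial = -inv / np.sqrt(np.outer(np.diag(inv), np.diag(inv)))

    off = ~np.eye(p, dtype=bool)  # off-diagonal entries only
    kmo = (corr[off] ** 2).sum() / ((corr[off] ** 2).sum() + (partial[off] ** 2).sum())

    det = np.linalg.det(corr)
    chi2 = -(n - 1 - (2 * p + 5) / 6) * np.log(det)
    dof = p * (p - 1) / 2
    p_value = stats.chi2.sf(chi2, dof)
    return kmo, chi2, p_value, det

# Hypothetical 0-1 scores for 216 students on 15 metalinguistic skills.
rng = np.random.default_rng(0)
data = rng.uniform(0, 1, size=(216, 15))
print(kmo_bartlett(data))
```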

For the extraction of factors, the Exploratory Factor Analysis was carried out using the principal components method and orthogonal rotation of the Varimax type. Factors were extracted based on eigenvalues >1 in the analyzed matrix, resulting in a factorial solution of five differentiated factors explaining 59% of the variance (see Table 5).

Table 5. Total variance explained by each of the identified components.

When examining the communalities table, no skill had a low value; therefore, none were excluded from the analysis. The inflection point of the scree plot also indicated that the instrument comprises five factors. The factor loadings derived from the analysis are included in Table 6. Factor loadings >0.400 were considered significant.

Table 6. Rotated component matrix.
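
As a hedged sketch of the extraction and rotation steps reported above (principal components of the correlation matrix, retention of eigenvalues > 1, varimax rotation, and a 0.400 loading cut-off), the following NumPy code uses randomly generated, hypothetical data and is not the authors' SPSS procedure.

```python
import numpy as np

def varimax(loadings: np.ndarray, max_iter: int = 100, tol: float = 1e-6) -> np.ndarray:
    """Orthogonal varimax rotation of a (variables x factors) loading matrix."""
    p, k = loadings.shape
    rotation = np.eye(k)
    var_old = 0.0
    for _ in range(max_iter):
        rotated = loadings @ rotation
        # Gradient of the varimax criterion.
        target = rotated ** 3 - rotated @ np.diag((rotated ** 2).sum(axis=0)) / p
        u, s, vt = np.linalg.svd(loadings.T @ target)
        rotation = u @ vt
        var_new = s.sum()
        if var_new < var_old * (1 + tol):
            break
        var_old = var_new
    return loadings @ rotation

# Hypothetical standardized scores: 216 students x 15 metalinguistic skills.
rng = np.random.default_rng(0)
scores = rng.normal(size=(216, 15))

# Principal components of the correlation matrix.
corr = np.corrcoef(scores, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]           # sort components by decreasing eigenvalue
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

keep = eigvals > 1                           # Kaiser criterion: eigenvalues > 1
loadings = eigvecs[:, keep] * np.sqrt(eigvals[keep])
explained = eigvals[keep].sum() / eigvals.sum()

rotated = varimax(loadings)
communalities = (rotated ** 2).sum(axis=1)   # variance of each skill explained by the factors
salient = np.abs(rotated) > 0.400            # loadings treated as significant
print(f"{keep.sum()} factors retained, {explained:.0%} of variance explained")
print("salient loadings per skill:", salient.sum(axis=1))
```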

Thus, the analysis ultimately groups the 15 metalinguistic skills into these five factors as follows:

3.4.1 Factor 1. Word and text recognition and structural understanding

This factor includes skills related to word recognition and the structural understanding of texts. The interpretation of this factor suggests that successful reading comprehension begins with the ability to recognize words accurately and understand the organization of different text types. These foundational skills are crucial for building more complex comprehension abilities. The metalinguistic skills in this factor are word recognition, morphological awareness, identifying the logical sequence of texts, and understanding the structure of different types of texts.

3.4.2 Factor 2. Evaluation and critical reflection of information for the creation of new ideas

This factor focuses on the evaluation of information and the critical reflection needed to generate new ideas. This factor is interpreted as emphasizing higher-order thinking skills. Students must not only understand the content but also evaluate its relevance and reliability, relate it to their prior knowledge, and use it to generate new insights or creative solutions. This reflects a deeper level of engagement with the text. The metalinguistic skills in this factor are using context to infer the meaning of unfamiliar words, critical evaluation of reliability and relevance of information, reflecting critically on the relationship between information and prior knowledge, and generating new ideas from analysis of information.

3.4.3 Factor 3. Advanced/deep reading strategies

This factor includes strategies for deep and advanced reading comprehension. These strategies are essential for understanding complex texts and arguments. By recognizing different elements of the text and employing active reading strategies, students can critically analyze the material, leading to a more profound comprehension and the ability to engage thoughtfully with the text. The metalinguistic skills in this factor are recognition of different elements of the text, use of active reading strategies, and critical analysis of arguments presented in texts.

3.4.4 Factor 4. Phonological awareness

This factor pertains to the awareness of the sound structures of language, highlighting the importance of phonological skills as a foundational component of reading. Strong phonological awareness supports decoding and word recognition, which are critical for fluent reading. This factor underscores the basic auditory skills necessary for developing reading proficiency. The metalinguistic skills in this factor are phonological awareness and phonemic awareness.

3.4.5 Factor 5. Structural understanding of language and detection of key elements

This factor involves understanding the grammatical structure of language and identifying key elements in texts. This factor is interpreted as focusing on the ability to parse and understand the syntactical and grammatical features of language. This understanding allows students to identify key words and phrases that are essential for grasping the main ideas and details of a text, contributing to overall reading comprehension. The metalinguistic skills in this factor are understanding of the grammatical structure of language and identification of key words in texts. Following this, Figure 1 illustrates this grouping.

Figure 1. Grouping of metalinguistic skills into the identified factors.

4 Discussion

The aim of this research has been to design and validate a reading proficiency assessment test for primary school students, the PCL-P. The results presented above provide empirical evidence for the validity of the instrument, as the internal consistency analyses performed present adequate reliability indicators, and the content validity analyses show significant agreement among the expert judges.

Furthermore, the findings presented suggest that the PCL-P has an appropriate internal structure. Although the instrument is designed to measure 15 metalinguistic skills originally classified into four levels, executive, functional, instrumental, and epistemic (Freebody and Luke, 1990), the factor loadings obtained in the exploratory factor analysis indicate that the test has five dimensions: (1) word and text recognition and structural understanding; (2) evaluation and critical reflection of information for the creation of new ideas; (3) advanced/deep reading strategies; (4) phonological awareness; and (5) structural understanding of language and detection of key elements. These factors do not align perfectly one-to-one with the levels in Freebody and Luke's (1990) model, but there is a clear correspondence that can be interpreted.

Factor 1, which includes skills such as word recognition, morphological awareness, identifying the logical sequence of texts, and understanding the structure of different types of texts, aligns with both the Executive and Instrumental levels. These skills encompass basic metalinguistic abilities as well as more advanced text comprehension skills.

Factor 2 corresponds closely with the Epistemic level, as it includes the use of context to infer the meaning of unfamiliar words, the critical evaluation of the reliability and relevance of information, reflecting critically on the relationship between information and prior knowledge, and generating new ideas from the analysis of information. These skills are essential for higher-order thinking and deep comprehension.

Factor 3 relates to both the Instrumental and Epistemic levels, as it involves the recognition of different elements of the text, the use of active reading strategies, and the critical analysis of arguments presented in texts, which are necessary for a deep understanding and critical engagement with texts.

Factor 4 is aligned with the Executive level, encompassing phonological awareness and phonemic awareness, which are foundational skills in early reading development.

Finally, Factor 5 corresponds to the Functional level, including understanding the grammatical structure of language and the identification of key words in texts, which are crucial for basic comprehension and text processing. This correspondence supports the validity of our instrument in assessing a wide range of metalinguistic skills across different dimensions of reading comprehension.

Additionally, this new taxonomy seems to make theoretical sense: reading proficiency begins with the ability to recognize words and understand the structure and sequence of texts (dimension 1), but is complemented by the ability to analyze the sound and grammatical structures of language (dimensions 4 and 5) and to employ active reading strategies (dimension 3), which finally allows for reflection on the information read in order to generate new knowledge (dimension 2).

Previous research validating other instruments for assessing reading proficiency in primary school students has also proposed various taxonomies. For example, Rosas et al. (2011) designed a test composed of two dimensions: fluency and comprehension (reading comprehension, fluency and accuracy, phonological awareness, and oral comprehension), and alphabetic knowledge (text types and alphabetic principle). Similarly, Cadavid-Ruiz et al. (2016) propose an instrument consisting of two factors: (a) alphabetic principle, phonological awareness, oral comprehension, fluency and accuracy, and reading comprehension; and (b) text types and naming speed. However, two studies validating tests that specifically assess reading comprehension categorize it into a single dimension (Cadime et al., 2013) or into three: short-text comprehension, long-text comprehension, and knowledge-based inferences (Pascual et al., 2021). This discrepancy in the factors that make up reading proficiency indicates that it is a multifaceted construct composed of numerous and diverse skills, which can be grouped in different ways. It is therefore important to continue conducting research on reading proficiency, particularly on its diagnosis, because diagnosis is the first step toward improving it.

Thus, the results presented here lead to the conclusion that the PCL-P is a valid instrument for assessing the reading proficiency of students in Primary 1–6. However, the small and homogeneous sample, which is the main limitation of the study, makes it necessary to replicate the application of the test in a larger and more diverse sample. As a future line of research, it is therefore proposed to study the psychometric properties of the PCL-P in samples of students from other autonomous communities in Spain or from other Spanish-speaking countries. In addition, it would be interesting to analyze whether, after applying the PCL-P for the first time (pretest) and carrying out pedagogical interventions to reinforce the dimensions of the competence detected as insufficient, those dimensions improve in a post-test.

Data availability statement

The datasets presented in this article are not readily available because they have not been made public. Requests to access the datasets should be directed to Laura Trimiño, laura.trimino.perez@deusto.es.

Ethics statement

The studies involving humans were approved by Comité de ética de la Universidad de Deusto. The studies were conducted in accordance with the local legislation and institutional requirements. Written informed consent for participation in this study was provided by the participants' legal guardians/next of kin.

Author contributions

LT-P: Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Software, Validation, Writing – original draft, Writing – review & editing. JH-R: Conceptualization, Formal analysis, Investigation, Methodology, Software, Validation, Writing – original draft, Writing – review & editing. EV: Conceptualization, Investigation, Project administration, Supervision, Validation, Writing – original draft, Writing – review & editing. AM: Data curation, Methodology, Software, Validation, Writing – original draft, Writing – review & editing.

Funding

The author(s) declare that no financial support was received for the research, authorship, and/or publication of this article.

Conflict of interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Publisher's note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

References

Alford, J., and Jetnikoff, A. (2016). A critical literacy approach to teaching textual analysis in the Australian secondary English classroom. Aust. J. Lang. Literacy 39, 202–213. doi: 10.1007/BF03651965

Ammermüller, A. (2012). Institutional features of schooling systems and educational inequality: cross-country evidence from PIRLS and PISA. German Econ. Rev. 14, 190–213. doi: 10.1111/j.1468-0475.2012.00565.x

Ballester, J. (2015). La formación lectora y literaria. Barcelona: Graó.

Brigman, G., Webb, L., and Campbell, C. (2018). Building skills for school success: Improving the academic and social competence of students. Prof. School Counsel. 10, 279–288. doi: 10.5330/prsc.10.3.v850256191627227

Cadavid-Ruiz, N., Quijano-Martínez, M., Escobar, P., Rosas, R., and Tenorio, M. (2016). Validación de una prueba computarizada de lectura inicial en niños escolares colombianos. Ocnos: Revista de Estudios sobre Lectura 15, 98–109. doi: 10.18239/ocnos_2016.15.2.1121

Cadime, I., Ribeiro, I., Viana, F. L., Santos, S., Prieto, G., and Maia, J. (2013). Validity of a reading comprehension test for portuguese students. Psicothema 25, 384–389. doi: 10.7334/psicothema2012.288

Camarillo, B., Silva, G., and Romero, S. (2021). El Modelo Simple de Lectura en la identificación de dificultades lectoras en educación primaria. Estudios pedagógicos 47, 343–357. doi: 10.4067/S0718-07052021000300343

Català, G., Català, M., Molina, E., and Monclús, R. (2001). Evaluación de la comprensión lectora. In: Pruebas ACL. Barcelona: Graó.

Cuetos, F., Rodríguez, B., Ruano, E., and Arribas, D. (2007). PROLEC-R. Evaluación de los procesos lectores-revisado. Barcelona: TEA.

Díaz-Iso, A., Velasco, E., and Meza, P. (2022). Intervenciones realizadas para mejorar la competencia lectora: una revisión sistemática. Revista de Educación 398, 249–281. doi: 10.4438/1988-592X-RE-2022-398-559

Field, A. P. (2013). Discovering statistics using IBM SPSS statistics: And sex, drugs and rock n roll. Los Angeles: Sage.

Freebody, P., and Luke, A. (1990). Literacies programs: Debates and demands in cultural context. Prospect 5, 7–16.

Gil, J. M. (2022). La enseñanza de la lengua a partir de los textos literarios como posible vía de solución a la falta de competencia lectora. Revista de Educación y Pensamiento 27, 30–36.

Gómez-Veiga, I., García-Madruga, J. A., Pérez-Hernández, E., Orjales, I., López-Escribano, C., Duque, G., et al. (2020). EDICOLE. Evaluación diagnóstica de la comprensión lectora. Barcelona: TEA.

Graham, S. (2020). The sciences of reading and writing must become more fully integrated. Read. Res. Q. 55, 35–44. doi: 10.1002/rrq.332

Guzmán-Simón, F., Gil-Flores, J., and Pacheco-Costa, A. (2020). Home literacy environment and reading comprehension in spanish primary education. J. Res. Read. 43, 229–247. doi: 10.1111/1467-9817.12299

Gwet, K. L. (2014). Handbook of Inter-Rater Reliability: The Definitive Guide to Measuring the Extent of Agreement Among Raters (4th ed.). Gaithersburg (EEUU): Advanced Analytics, LLC.

Hanushek, E. A. (2013). Economic growth in developing countries: the role of human capital. Econ. Educ. Rev. 37, 204–212. doi: 10.1016/j.econedurev.2013.04.005

Hoover, W. A., and Gough, P. B. (1990). The simple view of reading. Read. Writ. 2, 127–160. doi: 10.1007/BF00401799

Jones, E., Larsen, R., Sudweeks, R., Richard Young, K., and Gibb, G. (2018). Evaluating paraeducator-led reading interventions in elementary school: a multi-cutoff regression-discontinuity analysis. J. Res. Educ. Eff. 11, 507–534. doi: 10.1080/19345747.2018.1481164

Kintsch, W. (1988). The role of knowledge discourse comprehension: a construction-integration model. Psychol. Rev. 95, 163–182. doi: 10.1037/0033-295X.95.2.163

Landis, J. R., and Koch, G. G. (1977). The measurement of observer agreement for categorical data. Biometrics 33, 159–174. doi: 10.2307/2529310

López-Aguado, M., and Gutiérrez-Provecho, L. (2019). Cómo realizar e interpretar un análisis factorial exploratorio utilizando SPSS. REIRE Revista d Innovació i Recerca en Educació 12, 1–14. doi: 10.1344/reire2019.12.227057

McKenna, M., and Stahl, S. (2003). Assessment for Reading Instruction. New York: The Guilford Press.

Ministerio de Educación y Formación Profesional (2023). “PIRLS 2021. Estudio internacional de progreso en competencia lectora,” in Informe español. Secretaría general técnica. Available at: https://acortar.link/L6Q2Wn

Ministerio de Educación, Formación Profesional y Deportes. (2023). “PISA 2022,” in Programa para la evaluación internacional de los estudiantes. Informe español. Secretaría general técnica. Available at: https://acortar.link/afkH6B

Morales, A. F., and Espinosa, E. B. (2017). Evaluación de la competencia lectora de futuros docentes. Investigaciones sobre lectura 7, 7–21. doi: 10.37132/isl.v0i7.177

Mullis, I. V. S., and Martin, M. O. (2021). PIRLS 2021 Assessment Frameworks. Amsterdam: International Association for the Evaluation of Educational Achievement (IEA).

Organisation for Economic Co-operation and Development (2019). PISA 2018 Assessment and Analytical Framework. Paris: OECD Publishing.

Pascual, G., Goikoetxea, E., and Bustos, H. (2021). Propiedades psicométricas de un test de comprensión lectora para alumnos de educación primaria. Psykhe 30, 1–15. doi: 10.7764/psykhe.2018.22337

Preston, S. (2018). Literacy practices in rural secondary schools: application of Freebody and Luke's model in diverse settings. J. Adolesc. Adult Literacy 62, 205–214.

Rafi, M. (2020). Digital media literacy through Freebody and Luke's four resources model: High school case study in Indonesia. J. Media Literacy Educ. 12, 52–64.

Robledo, P., Fidalgo, R., and Méndez, M. (2019). Evaluación de la comprensión lectora a partir del análisis de la práctica del profesorado y la interacción docente-estudiante. Revista de Educación 384, 97–120. doi: 10.4438/1988-592X-RE-2019-384-414

Rosas, R., Medina, L., Meneses, A., Guajardo, A., Cuchacovich, S., and Escobar, P. (2011). Construcción y validación de una prueba de evaluación de competencia lectora inicial basada en computador. Pensamiento Educativo. Revista de Investigación Educacional Latinoamericana 48, 43–61. doi: 10.7764/PEL.48.1.2011.4

Snow, C. (2002). Reading for Understanding: Toward an R&D Program in Reading Comprehension. Santa Mónica (EEUU): RAND Reading Study Group. Available at: https://acortar.link/dNftr4 (accessed July, 2024).

Solé, I. (2004). “Proyectos y programas de innovación en la enseñanza y el aprendizaje de la lectura y la escritura,” in La práctica psicopedagógica en educación formal, eds. C. En, N. Monereo, Y. Mollá (Barcelona: UOC), 253–274.

Tshering, U. (2023). A model for teaching critical reading in an ESL curriculum. English Stud. 9, 271–292. doi: 10.33919/esnbu.23.2.7

Winlund, A. (2021). Writing practices of recently immigrated adolescent emergent writers: A study from a language introductory school in Sweden. J. Second Lang. Writ. 51:100794. doi: 10.1016/j.jslw.2021.100794

Keywords: reading test, validation, reading proficiency, student assessment, primary education

Citation: Trimiño-Pérez L, Hurtado-Reina J, Velasco E and Martinez A (2024) Design and validation of a Primary Reading Proficiency Test (PCL-P). Front. Lang. Sci. 3:1471040. doi: 10.3389/flang.2024.1471040

Received: 26 July 2024; Accepted: 07 November 2024;
Published: 27 November 2024.

Edited by:

Katherine Susan Binder, Mount Holyoke College, United States

Reviewed by:

Carola Alvarado, Universidad Santo Tomás, Chile
Debora Maria Befi-Lopes, University of São Paulo, Brazil

Copyright © 2024 Trimiño-Pérez, Hurtado-Reina, Velasco and Martinez. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Laura Trimiño-Pérez, laura.trimino.perez@deusto.es
