CURRICULUM, INSTRUCTION, AND PEDAGOGY article

Front. Educ., 06 June 2024
Sec. STEM Education
This article is part of the Research Topic Organic Chemistry Education Research into Practice

Development of a metacognition co-curriculum for a university course in introductory organic chemistry

Stephen L. MacNeil1*, Eileen Wood2 and Fatma Arslantas2
  • 1Department of Chemistry & Biochemistry, Wilfrid Laurier University, Waterloo, ON, Canada
  • 2Department of Psychology, Wilfrid Laurier University, Waterloo, ON, Canada

Metacognition is a fundamental skill that allows advanced learners to adapt to diverse learning environments. Metacognition, however, can be domain specific and students may fail to generalize metacognitive skills across domains. Thus, students in higher education may require specific training to acquire relevant metacognitive skills in differing domains or may need cueing to engage their metacognitive skills and knowledge in new domains. The present report describes the development of a co-curricular metacognitive program for chemistry students and suggests how this program could be adopted by other chemistry courses or adapted for other domains in higher education. Several supports were introduced in this program including self-assessment of competence with learning task inventories (LTIs; i.e., detailed lists of learning tasks), self-assessments of confidence regarding in-class content questions, and performance predictions and postdictions on tests. In general, exposure to these supports resulted in overall performance and confidence gains. However, individual differences were evident, with some students demonstrating greater learning gains than others. Initial Dunning-Kruger effects associated with pre- and postdictions, with low-performing students overestimating grades and high-performing students underestimating grades, decreased over exposure. We summarize the evolution of this metacognitive co-curricular program, the educational literature that guided it, and its differential impact on students.

Introduction

Metacognition

Metacognition is a multifaceted, fundamental skill that allows advanced learners to adapt to diverse learning environments effectively and efficiently. Metacognition is the mechanism through which adult learners control their cognitive processes through planning, monitoring, evaluating, and regulating their own learning (Flavell, 1979; Dimmitt and McCormick, 2012; Rivas et al., 2022). In addition to cognitive control, three key components of metacognitive knowledge include declarative knowledge (i.e., awareness of what you do and do not know in a particular domain); procedural knowledge (i.e., a repertoire of strategies to utilize for different learning demands); and conditional knowledge (i.e., the ability to use available knowledge, strategies, and other tools when needed; Schraw et al., 2006). Successful learning is evidenced when metacognitive skills are used to identify a learning task, map approaches to tackle it, monitor its process and outcomes, and, at steps throughout the process, engage in self-reflection and self-assessment regarding decisions made, progress occurring, and outcomes achieved. Typically, more successful students are better able to calibrate their learning and performance (Dimmitt and McCormick, 2012; Saks et al., 2021).

In today’s higher education classrooms, instructional approaches nested in student-centered learning, constructivist pedagogies, and flexible/hybrid formats draw heavily on students’ metacognitive skills to drive their own learning. Students must regularly assess what is new, unknown, or poorly understood to regulate their reading, studying, and preparation within these contexts. Although advanced learners in higher education contexts typically have acquired generalized metacognitive skills that prepare them to approach a broad range of new learning tasks (Geurten et al., 2018), learners may fail to draw upon these skills. More precisely, in some circumstances, application of metacognitive skills differs across domains. This domain specificity means that students are less able or likely to apply their metacognitive skills across domains of study, especially when the domain or tasks are perceived to be more difficult (Scott and Berman, 2013). In addition, some advanced learners may not have the scope of strategic repertoires needed to engage as independently as expected in today’s classrooms (García-Pérez et al., 2021). As a result, students may need explicit instruction or prompts to scaffold and encourage use of metacognitive skills in new or challenging domains.

Effective instructional design acknowledges this student need and necessitates the development of supports to teach or scaffold activation of metacognitive skills. This paper describes the translation of research on metacognition and its role in the learning process through the development of a co-curricular metacognitive program to facilitate learning in a second-year organic chemistry course. We outline both the translation process and the effects of components of this metacognitive program on students’ metacognitive skills and performance in the course.

Key to this program were course elements designed to enhance planning, monitoring, and reflection skills as well as explicit instruction regarding metacognition. When students are presented with tasks, they first need to assess whether the task is familiar. They must ask themselves, “Can I recall doing this type of task or a similar task before?” Memory can then serve as a guide for planning (i.e., what, when, and how to read, engage in study strategies, and organize learning priorities) depending on the learners’ assessment of what they do or do not know about the topic or task. Many types of monitoring activities then follow this initial memory task, for example, checking and assessing performance as a task unfolds. This may require a single simple check for easier tasks but may be an iterative process for challenging tasks. To draw on an organic chemistry example, consider the assignment of R vs. S stereochemistry to a chirality center in an organic molecule. The student must first assess their readiness for this task, e.g., “Have I done this previously?”, “Can I recall the steps involved?”, and “Are there particular steps I have struggled with in the past?” After recalling or reviewing the steps required, the student must then execute each step and monitor progress. Have they properly assigned priorities to substituents at the chirality center using the Cahn-Ingold-Prelog rules? Is the molecule being viewed from the proper angle? If not, the student must decide on a mental or physical manipulation to reorient the molecule and judge the effectiveness of that manipulation. Finally, the student must assess the direction in which the remaining substituents decrease in priority and, subsequently, assign R or S stereochemistry. In addition to monitoring the process, the learner must monitor the outcomes. Did the student assign stereochemistry correctly? More precisely, effective monitoring requires that corrections are made after errors are detected. If a mistake in assigning stereochemistry was made, what was the problem? Were substituent priorities assigned correctly? Was the mental/physical manipulation done properly? Was the correct label – R vs. S – assigned based on the direction of decreasing priorities?
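For readers who want to verify such an assignment computationally, the short sketch below uses the open-source RDKit cheminformatics toolkit; this is our illustration of the monitor-and-correct loop just described, not a tool used in the course itself.

```python
# A minimal sketch, assuming the open-source RDKit toolkit is installed
# (pip install rdkit). This is an illustration, not part of the course.
from rdkit import Chem

# (S)-alanine, written as an isomeric SMILES string
mol = Chem.MolFromSmiles("C[C@H](N)C(=O)O")

# Apply the Cahn-Ingold-Prelog priority rules to perceive stereochemistry
Chem.AssignStereochemistry(mol, cleanIt=True, force=True)

# List (atom index, label) pairs; centers that cannot be labeled are flagged
print(Chem.FindMolChiralCenters(mol, includeUnassigned=True))
# Expected output: [(1, 'S')]
```

A student could use such a check to confirm a by-hand R/S assignment and, when the answers disagree, revisit the priority-assignment and reorientation steps described above.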

Successful planning and monitoring presume that learners are motivated to learn and have sufficient time to learn. Both motivation and executive processing can enhance use of metacognitive skills (Rivas et al., 2022). For example, a highly motivated, self-regulated learner is more likely to schedule study time in advance, arrange an environment conducive to studying, complete assignments, explore additional examples, read, review and summarize, and engage in diligent assessment of what has or has not been learned (Heikkilä and Lonka, 2006). Less motivated and more challenged students require cues or prompts to scaffold these steps and keep them on task. Metacognitive reflection is an effective learning tool (e.g., Bangert-Drowns et al., 2004; Dignath and Büttner, 2008) that involves thinking about what, how, and why one does what one does. Through engaging in critical evaluation, learners gain new insights and perspectives (e.g., Grimmett and MacKinnon, 1992). To encourage reflection, the learning environment must provide opportunities for learners to ‘take stock’ of their own approach to learning.

Why metacognition in organic chemistry?

Given the diversity in metacognitive skills among university students and the potential for domain specificity in application of these skills, students in higher education may require specific training, scaffolding, or cueing to acquire or transfer relevant metacognitive skills across differing domains (Zohar and Dori, 2012). Introductory organic chemistry is typically a prerequisite course for subsequent chemistry and other science studies. As such, it is both desired and feared. Consequently, significant research has been directed toward learning and instruction in organic chemistry (e.g., Kranz et al., 2023; Pilcher et al., 2023). The present paper summarizes steps toward establishing course-specific metacognitive training.

What prompted the development of a metacognition co-curriculum for introductory organic chemistry?

Inspired by the potential inherent in blended learning designs (Garrison and Vaughan, 2008) and tenets of adult metacognition, the traditional lecture-based course in introductory organic chemistry was transformed to a blended learning format. As part of the re-design process, comprehensive lists of chapter-by-chapter learning tasks were created, with specific low-level tasks, e.g., “define chirality center” (Bloom’s revised taxonomy level 1; Krathwohl, 2002), identified as those to be completed independently by learners before class, and other higher-level tasks, e.g., “given the structure of an organic compound, identify all chirality centers” (Bloom’s revised taxonomy level 4), demarcated as those to be completed, with assistance from other students and the instructor as needed, in class. These task lists served as an organizational scaffold that clearly identified the knowledge and skills required for success. When the blended learning course was launched, these learning task lists were posted as PDFs via the online course management system. Consistent with self-directed models of adult learning (Merriam, 2001) and metacognitive monitoring, it was expected that students would access and use this resource to plan and evaluate their progression through the term. However, on average, only 39% of students accessed the PDFs across the term. Given the novelty of these learning task lists, we decided to encourage students to access and review the lists and recognize them as a support for learning, not just supplementary materials. In the next course offering, the PDFs were converted to surveys delivered through the course management system and students were instructed to engage in self-assessment of knowledge by indicating on a 5-point scale how well they could do each learning task before they could access online resources for the next chapter. Restricting access to the next week’s content improved participation (> 90% across the term), but students’ thoughtful engagement with the learning tasks remained problematic. For example, when the following item was added to a learning tasks survey, “Select 1 if you are reading this,” only ~50% of students selected 1! This clearly indicated that many students were not engaging with the surveys as anticipated and many were responding arbitrarily just to gain access to the next week’s content. It was at this point that a partnership formed between the instructor (an organic chemist) and an educational/developmental psychologist, who introduced the concept of metacognition and speculated that students’ failure to access and engage with the provided resources might stem from metacognitive failure. This led to a discipline-based education research (DBER) project aimed at improving students’ metacognitive skills, starting with an examination of engagement with the learning task lists. The following narrative is a retrospective summary of the DBER project as it unfolded across four consecutive Fall term iterations of a single-section, in-person, introductory organic chemistry course (Organic Chemistry I) taught by the same instructor.

Motivations, methods, and results to date

Phase I

The first step in the DBER project was to determine optimal conditions for having students engage with the LTIs. In total, 293 of the 311 students registered in the Organic Chemistry I course agreed to participate and were randomly assigned to five treatment groups, each of which was provided nine weekly LTIs across the term and, at the end of the week, asked to rate their ability to complete each task using the same 5-point mastery scale noted above. Students in Condition 1 self-assessed domain knowledge by completing the LTI ratings. To gauge the fidelity of the LTI rating measure, students in Condition 2 completed the LTI ratings including a question embedded in the LTI that assessed how carefully students were completing the ratings (e.g., “Are you reading each task carefully or clicking away without doing so?”). We then added three additional conditions to assess students’ performance when tested on their knowledge and provided with varying levels of feedback. To evaluate the effect of testing (Roediger and Karpicke, 2006) and feedback (Pashler et al., 2005; Fazio et al., 2010), participants in Conditions 3–5 completed the LTI ratings and fidelity measure, followed by a 5-question multiple choice quiz. This permitted us to compare perceived mastery to actual performance for five learning tasks per LTI. Participants in Condition 3 received no feedback on their quiz performance; those in Condition 4 were provided the correct answers with no explanations after each quiz; and those in Condition 5 were provided the correct answers with full explanations after each quiz. These students also responded to a 10-item post-quiz survey assessing their perceptions of quiz difficulty, comparison of LTI ratings to quiz performance, and changes in quiz difficulty, content covered, and student engagement and interest from week to week (see Supplementary material). All students also completed an end-of-term survey to ascertain reactions to the LTIs and provide general chemistry grades and GPAs as measures of prior learning.

The major findings were that (i) treatment condition did not affect final exam grades but (ii) the number of LTIs completed did, with completion of more LTIs leading to higher final exam grades (see Table 2 in MacNeil et al., 2013). The effect of number of LTIs completed was over and above that which could be explained by differences in prior learning. In addition, 72% of students indicated they would recommend using LTIs in future offerings of the course and most participants felt that LTIs improved awareness of learning tasks they could not do. However, when asked about the impact of using the LTIs on changing study habits or improving grades, most students attributed only a small (56.6%) or no (28.1%) impact on final grades and very few students reported any impact on study habits, with 80.4% indicating no impact. Overall, these results confirmed the value of the LTIs as a tool for supporting learning, but students clearly did not perceive differences in metacognitive skills associated with planning, monitoring, or evaluation.

Phase II

Given the documented effects of testing (Roediger and Karpicke, 2006) and type of feedback (Pashler et al., 2005; Fazio et al., 2010) on learning, it was surprising to see in Phase I that treatment condition had no effect on final exam grades, but the overall positive effect of number of LTIs completed on final exam grades was encouraging. This result prompted us to assess whether completion of LTIs also improved metacognitive skills more explicitly. Consequently, 211 of the 310 students registered in the next offering of Organic Chemistry I participated in Phase II of the DBER project. To measure metacognitive skills directly, students completed a metacognitive awareness assessment scale (Schraw and Dennison, 1994) at the beginning of the term and a condensed version at the end. Given the documented benefits of priming (Ratcliff et al., 1997) and distributed practice (Benjamin and Tullis, 2010) on learning and performance, we also wanted to test the effects of ‘priming’ and timing of LTIs on participants’ metacognitive skills. Thus, four treatment conditions were employed. Participants in Conditions 1 and 2 completed weekly LTIs (distributed practice) as in Phase I, but those in Condition 2 were primed with a list of LTI items at the beginning of each week posted as a PDF on the course management system. These listed items served as cues to prime students for the full LTIs that would be required later in the week. Participants in Conditions 3 and 4 completed only two aggregated LTIs (massed practice), one during the week of the midterm test and one during the week of the final exam, with those in Condition 4, like Condition 2, primed for these aggregated LTIs via the corresponding aggregated inventory cue posted to the course management system at the beginning of the week. The Condition 5 LTI procedure from Phase I was employed, i.e., completion of the basic LTI with metacognitive prompts (e.g., “As you read each learning task, do you think of it as a possible question that you might be tested on?”) to test the reliability of students’ mastery ratings, followed by a quiz with full feedback and a post-quiz survey, with one exception – instead of rating their mastery for individual learning tasks, participants rated their abilities for groups of 6 learning tasks to reduce the likelihood of survey fatigue. Surprisingly, there were no differences across conditions (primed versus not primed, or massed versus distributed) with respect to number of LTIs completed or exam performance, nor were there differences across the term for the three measures of metacognitive awareness: knowledge of cognition, regulation of cognition, and overall metacognitive awareness.

Phase III

Given the lack of observed differences from priming and distributed vs. massed practice with respect to students’ self-reported metacognitive skills across the term in Phase II, we decided to make metacognition a more explicit part of the course. Specifically, we made five changes for Phase III: (i) introduction of a lecture on metacognition, (ii) reversion to weekly LTIs based on the previously observed finding that completion of more LTIs led to higher final exam grades, (iii) addition of a weekly in-class question at the beginning of each week to assess how students interacted with the priming inventory cues, (iv) expansion of the end-of-term metacognitive awareness assessment to include all items used in the beginning-of-term metacognitive awareness assessment (note that only about half of the items were used at the end of term in Phase II), and (v) addition of more metacognitive practice activities in the form of midterm test and final exam grade predictions and postdictions.

Consistent with literature regarding effective metacognitive training (Cook et al., 2013), the addition of the lecture on metacognition explicitly cued students to use metacognitive skills (planning, monitoring, self-assessment, etc.) when engaged in their chemistry course. The addition of a weekly in-class question at the beginning of each week to assess how students interacted with the inventory cues served to prime and encourage self-reflection and assessment. Thus, each week at the beginning of the first class, participants were asked via the iClicker® personal response system “What did you do with the Chapter ‘X’ inventory cue posted to the ‘Content’ section of the CH202/204 MyLS page?” Possible answers were: A. I did not know, or I forgot it was available. B. I know it was available, but I did not access it. C. I skimmed it briefly. D. I read it carefully, and E. I read it carefully and prepared a physical/mental check list of what I could/could not do. This priming was expected to lead to improved engagement with the inventory cues at the beginning of the week and, consequently, with the LTIs at the end of each week. It was anticipated that this priming and improved engagement would, in turn, lead to improved use of metacognitive skills and performance. In total, 238 of the 296 students registered in the next offering of Organic Chemistry I participated in Phase III. Across the term, students accessed the inventory cues 37.4% of the time, indicating that the inventory cues prompted students to preview the material only some of the time. Analyses indicated that “skimmed” (response C) was the most frequently endorsed choice among “middle” (final course grade: 60–79.99%; 39.1% endorsing) and “bottom” (final course grade: 9.99–59.99%; 39.1% endorsing) performers, and “no access” (response B) was the most frequently endorsed choice among “top” (final course grade: 80–100%; 38.9% endorsing) performers (see Supplementary Information for a more complete summary of select analyses for Phase III). Although effective use of the inventory cues decreased across the term (see Supplementary Information Phase III summary), regression analysis revealed that the number of inventory cues accessed predicted the number of LTIs completed (t(228) = 3.292, p = 0.001), which, in turn, had a significant effect on final course grades, with participants completing more LTIs earning higher course grades (t(228) = 2.72, p = 0.007). Noteworthy is that the effect size for LTIs was larger than for all three prior learning variables (Chemistry 1 and 2 grades and overall GPA) (see Supplementary Information Phase III summary). Unfortunately, access to inventory cues had no effect on participants’ self-reported metacognitive skills.
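As a purely hypothetical illustration of the cross-tabulation behind these percentages (our own sketch, not the authors’ analysis code), the snippet below tallies the most frequently endorsed iClicker option within each final-grade band defined above; the response data are invented placeholders.

```python
# Hypothetical sketch of the response-by-grade-band tally described above.
# The (grade, response) pairs are invented; the top/middle/bottom bands
# follow the cut-offs given in the text.
from collections import Counter

responses = [(92, "B"), (85, "B"), (74, "C"), (63, "C"), (55, "C"), (48, "B")]

def band(grade):
    if grade >= 80:
        return "top"
    return "middle" if grade >= 60 else "bottom"

tallies = {}
for grade, choice in responses:
    tallies.setdefault(band(grade), Counter())[choice] += 1

for name, counts in sorted(tallies.items()):
    choice, n = counts.most_common(1)[0]
    share = 100 * n / sum(counts.values())
    print(f"{name}: most endorsed = {choice} ({share:.1f}% endorsing)")
```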

Given the importance of self-evaluation as a metacognitive skill (Hacker et al., 2000), one group of participants (Group A) was asked to make midterm test and final exam grade predictions immediately after each test/exam, i.e., postdictions. A second group of participants (Group B) was asked to make midterm test and final exam grade predictions immediately before (same day as) and immediately after completing each test/exam. Finally, a third group of participants (Group C) was asked to make midterm test and final exam grade predictions 2 weeks before, immediately before (same day as), and immediately after completing each test/exam. These test/exam grade predictions and postdictions proved important to students’ metacognitive skills in that a Dunning-Kruger effect (Kruger and Dunning, 1999) was observed for all pre- and postdictions, with the lower achieving students grossly overestimating their grades and higher achieving students slightly underestimating their grades, but this effect decreased over time. For example, statistical significance was achieved for Group C’s “day of” predictions between midterm tests 1 and 2 (F(2,46) = 10.21, p < 0.001) and all groups’ postdictions between midterm tests 1 and 2 (F(1,159) = 18.03, p < 0.001), with predictions being more accurate at the later time points. A reduction in the Dunning-Kruger effect suggests improvements in metacognition. Interestingly, participants reported an overall decrease in metacognition from beginning to end of term, and treatment group did not affect test, exam, or final course grades.
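The calibration measure underlying these Dunning-Kruger analyses is simply the signed difference between predicted and actual grades. A minimal sketch, using invented numbers rather than study data, shows how the pattern is detected:

```python
# Minimal sketch of the calibration computation described above:
# signed error = predicted grade - actual grade. A positive mean error among
# low achievers and a negative mean error among high achievers is the
# Dunning-Kruger pattern; all values below are invented placeholders.
from statistics import mean

predictions = [  # (predicted %, actual %)
    (85, 52), (80, 58), (76, 70), (78, 82), (72, 88), (74, 93),
]

low = [p - a for p, a in predictions if a < 60]    # low achievers
high = [p - a for p, a in predictions if a >= 80]  # high achievers
print(f"low achievers:  mean signed error = {mean(low):+.1f}")   # overestimation
print(f"high achievers: mean signed error = {mean(high):+.1f}")  # underestimation
```

Tracking how these mean errors shrink from midterm test 1 to midterm test 2 corresponds to the improvement in calibration reported above.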

Phase IV

With several positive results realized through Phases I–III of the DBER project, a formal metacognition co-curriculum was implemented during Phase IV. For the first time, all 289 students registered in the course were expected to complete metacognitive practice activities as a required component of the course. Of these 289 students, 259 agreed to have their data analyzed. The metacognition co-curriculum accounted for 10% of the final course grade and was composed of the following activities: (i) an introductory survey including a metacognitive awareness assessment (1%); (ii) an instructor-developed prior learning assessment (2.5%); (iii) 8 weekly inventory cues (1% if at least 6 of 8 were completed); (iv) 8 weekly LTIs (1% if at least 6 of 8 were completed); (v) weekly in-class confidence questions (2.5%); (vi) 9 test grade prediction and postdiction surveys, including 3 one-week predictions, 3 day-of predictions, and 3 postdictions immediately following each test (1% if at least 7 of 9 were completed); and (vii) an end-of-term survey including a metacognitive awareness assessment (1%).
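To make the weighting concrete, the sketch below encodes the completion thresholds and weights listed above. The function name and input format are our own, and we assume, since the text does not specify the scoring rule, that the prior learning assessment and confidence-question components were scored proportionally.

```python
# A minimal sketch of the Phase IV co-curriculum grading scheme (10% of the
# final course grade). Thresholds and weights come from the text; the
# proportional scoring of items (ii) and (v) is an assumption.
def cocurriculum_grade(intro_survey_done, prior_learning_score,
                       cues_completed, ltis_completed,
                       confidence_score, predictions_completed,
                       end_survey_done):
    grade = 0.0
    grade += 1.0 if intro_survey_done else 0.0           # (i) intro survey, 1%
    grade += 2.5 * prior_learning_score                  # (ii) prior learning, 2.5% (0..1, assumed)
    grade += 1.0 if cues_completed >= 6 else 0.0         # (iii) >= 6 of 8 inventory cues, 1%
    grade += 1.0 if ltis_completed >= 6 else 0.0         # (iv) >= 6 of 8 LTIs, 1%
    grade += 2.5 * confidence_score                      # (v) confidence questions, 2.5% (0..1, assumed)
    grade += 1.0 if predictions_completed >= 7 else 0.0  # (vi) >= 7 of 9 prediction surveys, 1%
    grade += 1.0 if end_survey_done else 0.0             # (vii) end-of-term survey, 1%
    return grade  # percentage points out of 10

print(cocurriculum_grade(True, 0.8, 7, 8, 0.9, 9, True))  # -> 9.25
```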

The instructor-developed prior learning assessment consisted of 22 multiple-choice questions based on relevant material from the prerequisite general chemistry courses. Students first completed the assessment with no preparation and for no points and could immediately review their results. Students were then granted access to relevant course resources and provided the opportunity to review material before completing the assessment again 1 week later, this time for 2.5% of their final course grade. This assessment served as an introduction to the metacognition co-curriculum, prompting students to reflect on prior learning while completing the first iteration of the assessment, then self-assess and review only the content with which they struggled in preparation for the second iteration.

The addition of weekly in-class confidence questions was inspired by a report on the use of formative assessment and self-regulated learning in mathematics (Hudesman et al., 2014). In each class, whenever a content-based question was posed (typically 3–4 per class), students would first be asked to rate their level of confidence in responding to the question on a 5-point scale. For example, if the content question asked for the number of chirality centers in an organic molecule, students would be told what the question was but would be asked to rate their confidence before being shown the structure. Immediately following the collection of responses to the content question, students would be asked to rate their level of confidence in having responded correctly to the question before being shown the correct answer. These confidence questions are similar but distinct in that the first prompts students to summon and assess prior learning, whereas the second asks them to judge performance.
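A hypothetical sketch of how the two confidence ratings can be paired with correctness for analysis (our own data layout; the text does not describe the authors’ scoring) appears below.

```python
# Hypothetical pairing of pre- and post-question confidence (1-5 scales) with
# correctness on the in-class content question; trial values are invented.
from statistics import mean

# (pre-question confidence, post-question confidence, answered correctly)
trials = [(4, 5, False), (3, 4, True), (2, 3, True), (5, 5, False), (3, 2, True)]

for label, ok in (("correct", True), ("incorrect", False)):
    pre = mean(p for p, _, c in trials if c == ok)
    post = mean(q for _, q, c in trials if c == ok)
    print(f"{label}: mean pre-question = {pre:.1f}, mean post-question = {post:.1f}")
```

In this toy data, confidence is higher on incorrectly answered questions, mirroring the overconfidence among low performers reported in Phase IV.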

Major findings from Phase IV include (i) a clear Dunning-Kruger effect across the term for test grade pre- and postdictions, with (a) low-performing students overestimating grades and high-performing students underestimating, (b) the overall effect decreasing over time, both within and between tests, (c) low-achieving students becoming more accurate over time but high-achieving students becoming less accurate, and (d) day-of predictions and postdictions being significant predictors of actual test scores (e.g., for the final exam, F(3,140) = 45.18, p < 0.001; see Supplementary Information Phase IV findings); and (ii) an increase in confidence for both pre- and post-question confidence questions when average confidence ratings before midterm test 1 were compared to average confidence ratings after midterm test 2 (e.g., the significant main effect for pre-question confidence was F(2,244) = 105.06, p < 0.001). Interestingly, and consistent with previous research (Hacker et al., 2000), low-performing students on the final exam were those who tended to be overly confident on pre-question ratings – even after controlling for GPA (e.g., for average pre-question confidence prior to midterm test 1, β = −4.46, t = −2.84, p = 0.005). Overestimations in confidence for post-questions, however, were no longer evident and post-question responses did not predict actual test grades for any of the tests. Finally, and once again, metacognitive awareness assessment scores were observed to decrease from beginning to end of term.

Discussion

The key aim of the present report was to describe the development of a metacognition co-curriculum that was implemented to scaffold and enhance students’ use of metacognitive skills in an Organic Chemistry course. An important part of the translation from the literature to practice involved quasi-experimental analyses to better assess change, challenges, and successes. Borrowing from literatures spanning education, cognition, and instructional psychology, a series of supports were integrated into ongoing classroom instruction over four offerings of the same course. The translation process was iterative and progressive, culminating in a true co-curriculum of metacognition for students in the course. A salient outcome of the process was that supporting students’ application of metacognitive skills enhanced their course performance. Perhaps more important, the metacognitive co-curriculum supports an array of skills tied to metacognition that students may or may not have, and engaging in these skills affirms the importance of these advanced learning behaviors, which may also transfer to other domains of study.

Several supports were introduced in this program including priming for content accompanied by self-assessment of competence with learning task inventories (LTIs), self-assessments of confidence regarding in-class content questions, and performance predictions and postdictions on tests. In general, exposure to these supports resulted in overall performance and confidence gains.

Scaffolding metacognitive skills by providing students with the full inventories of learning tasks by the end of the week and subsequently with LTI cues at the beginning of a week was associated with learning gains, as evidenced by improvements in final exam or final course grades. These interventions support students’ ability to assess their declarative knowledge, which allows them to control and direct subsequent cognitive processes involving planning, monitoring, and evaluating their acquisition of these concepts (Dimmitt and McCormick, 2012; Rivas et al., 2022). However, availability of these supports is not sufficient. When they were simply available, students did not access them. When students were required to access them, their grades improved. Interestingly, many students indicated skimming the initial “cuing” lists. At a glance, this may seem to signify minimal engagement. However, advanced learners should be skimming such content to quickly assess familiarity with the concepts as a first step in processing what they do or do not know. To engage students more deeply, they need to be encouraged to try to solve the tasks, evaluate their attempts, and make corrections as needed. This was facilitated through two mechanisms: introducing mastery questions to have students solve problems and subsequently pairing these content questions with confidence questions asked before and after answering the content questions during class. These components encouraged students to engage their procedural and conditional knowledge skills to access strategies and problem solving specific to the question. The confidence questions also encouraged further assessment in these knowledge areas through self-reflection regarding their ability to respond to the questions and their performance once they answered questions (Bangert-Drowns et al., 2004; Dignath and Büttner, 2008). These adaptations supported students in the learning process. The test and exam grade predictions and postdictions assisted students in their application of metacognitive skills in the assessment process. That calibrations became more accurate with practice is evidence that students were better able to assess their learning. Instituting these estimations before, during, and after testing extended use of metacognitive skills throughout the process.

An important consideration when shifting from testing different iterations to translating the metacognitive scaffolds into a full co-curriculum was the intentional assignment of grades for completion of the co-curriculum elements. Although learners may be expected to adopt supports when they are made available, it is clear from our studies that students may not engage in best practice, especially in courses where attention to anything but content may be perceived as costly. Given the demands inherent in learning the course content, students may perceive that the co-curriculum presents additional, insurmountable demands that could compromise performance. Evidence of this reluctance was clear in the limited uptake of the LTIs when review was made optional and in the dismissal of the LTI content by students performing most poorly in the course. Thus, to signal to students that the co-curriculum is valuable and worthy of their efforts, its elements were assigned course weight. This encouraged students to engage with and embrace the co-curriculum. Developing these skills in the context of a required introductory organic chemistry course provides a foundation not only for generalizing the skills learned through this course to more senior courses in this domain but also for transfer across domains.

As part of the process of creating the metacognitive co-curriculum, we identified challenges in translating some evidence-based interventions identified as supports in the extant literature. For example, there were no differences across conditions for those engaged in massed versus distributed practice with the LTIs, priming worked sometimes but not consistently, and ratings for aspects of metacognitive awareness tended to decrease over time. Despite the lack of support for distributed practice in the one instance where we tested it, we adopted it as part of the co-curriculum given substantial work showing its value (e.g., Donovan and Radosevich, 1999). Our failure to detect differences between these two study approaches may have been an artifact of the many manipulations being tested at the same time, which may have masked effects. Similarly, priming effects, absent when first tested in Phase II, emerged in later iterations when cueing and self-evaluation were the greater focus of study. Regardless, priming with the list cues at the beginning of the week was retained in the co-curriculum. Initially, we were concerned to see a drop in students’ self-reported metacognitive awareness scores from the beginning to the end of term. However, the reduction in the Dunning-Kruger effect and enhanced calibration in prediction and postdiction measures suggest that the decrease in students’ evaluation of their metacognitive skills may reflect a shift from an overly optimistic view of their abilities to a more realistic evaluation as a function of engaging in so many relevant activities.

Conclusion, constraints, and future work

Overall, at the class level, the addition of the metacognition co-curriculum improved students’ performance, as indicated by final exam and final course scores, and improved metacognitive skills, as indicated by the reduction in the Dunning-Kruger effect. The optimistically high in-class confidence ratings, particularly when coupled with their negative correlation with final exam scores, suggest a possible self-preservation mechanism that hints at the importance of considering the affective domain of learning. Results also indicate that not all students interacted with the elements of the metacognition co-curriculum to the same extent. This may be expected given individual differences in metacognitive skills, especially when these skills are applied to different domains. It may be the case that the co-curriculum benefits some students more than others. For example, more advanced students may experience less benefit, likely because they have already developed and automatically use the skills that the metacognition co-curriculum aims to foster. Examining individual differences as a function of the co-curricular program in general, and as a function of different components of the co-curriculum, would be ideal for establishing what works best for which learners under which circumstances. Assessing performance beyond a single term would also provide evidence of changing needs over time, another important avenue of exploration. Finally, one aspect that the present model of metacognitive co-curriculum has not assessed is social metacognition, the sharing of metacognitive responsibilities when students engage interactively with one another in partnerships or groups (e.g., Chiu and Kuo, 2010). Given the shift to active and student-centered learning, this additional aspect of metacognition would be interesting to explore and would contribute to a broader understanding of metacognitive skill development in the active classroom.

The present study describes the development of a co-curricular metacognitive program for chemistry students that can be adapted and extended to other chemistry courses or other domains in higher education. Instructors wanting to implement this metacognitive co-curriculum can follow the elements outlined above. Instructors will need to prepare an introduction to metacognition for their students, which can be achieved using available resources (see Cook et al., 2013). Instructors will also need to create a prior learning assessment based on the prerequisite course(s), which can be drawn from an examination of those courses’ outlines. Finally, instructors need to develop detailed lists of learning tasks associated with their course. All other components of the metacognition co-curriculum identified above (having students predict their grades at planned times during the course and assess confidence in answering content-based in-class questions) can be implemented with minimal preparation. In general, over our series of studies, exposure to these supports resulted in overall performance and confidence gains, and these results should transfer to other chemistry courses and domains.

Data availability statement

The data sets presented in this article are not readily or publicly available. However, anonymized versions of some of the data sets may be available upon reasonable written request to the first author. Ethics approval and informed consent for some of the data reported here did not include sharing of data sets. In these situations, those specific aspects of the data and other identifiable information (gleaned from open ended answers where participants did not provide permission for quotes or use) may not be shared. Requests to access the data sets should be directed to SM, smacneil@wlu.ca.

Ethics statement

The studies involving humans were approved by the Research Ethics Review Board, Wilfrid Laurier University. The studies were conducted in accordance with the local legislation and institutional requirements. The participants provided their written informed consent to participate in the studies described here.

Author contributions

SM: Writing – original draft, Writing – review & editing. EW: Writing – original draft, Writing – review & editing. FA: Writing – original draft.

Funding

The author(s) declare that no financial support was received for the research, authorship, and/or publication of this article.

Acknowledgments

We thank the undergraduate students who assisted with data collection and analysis associated with this DBER project, including Patrick Smith, Robyn Glover, Hanna Douglas, Deanna MacNeil, Brandon Mullen, Brittany Doerr, Andrea Debrouwer, Mike Urschel, and Tarique Plummer.

Conflict of interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Publisher’s note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

Supplementary material

The Supplementary material for this article can be found online at: https://www.frontiersin.org/articles/10.3389/feduc.2024.1402599/full#supplementary-material

SUPPLEMENTARY DATA SHEET 1 Representative measures and selected data.

References

Bangert-Drowns, R. L., Hurley, M. M., and Wilkinson, B. (2004). The effects of school-based writing-to-learn interventions on academic achievement: a meta-analysis. Rev. Educ. Res. 74, 29–58. doi: 10.3102/00346543074001029

Benjamin, A. S., and Tullis, J. (2010). What makes distributed practice effective? Cogn. Psychol. 61, 228–247. doi: 10.1016/j.cogpsych.2010.05.004

Chiu, M. M., and Kuo, S. W. (2010). From metacognition to social metacognition: similarities, differences, and learning. J. Educ. Res. 3, 321–338.

Cook, E., Kennedy, E., and McGuire, S. (2013). Effect of teaching metacognitive learning strategies on performance in general chemistry courses. J. Chem. Educ. 90, 961–967. doi: 10.1021/ed300686h

Dignath, C., and Büttner, G. (2008). Components of fostering self-regulated learning among students. A meta-analysis on intervention studies at primary and secondary school level. Metacogn. Learn. 3, 231–264. doi: 10.1007/s11409-008-9029-x

Dimmitt, C., and McCormick, C. B. (2012). “Metacognition in education” in APA educational psychology handbook. eds. K. R. Harris, S. Graham, T. Urdan, C. B. McCormick, G. M. Sinatra, and J. Sweller (Washington, DC: American Psychological Association).

Donovan, J. J., and Radosevich, D. J. (1999). A meta-analytic review of the distribution of practice effect: now you see it, now you don't. J. Appl. Psychol. 84, 795–805. doi: 10.1037/0021-9010.84.5.795

Fazio, L. K., Huelser, B. J., Johnson, A., and Marsh, E. J. (2010). Receiving right/wrong feedback: consequences for learning. Memory 18, 335–350. doi: 10.1080/09658211003652491

Flavell, J. H. (1979). Metacognition and cognitive monitoring: A new area of cognitive–developmental inquiry. Am. Psychol. 34, 906–911. doi: 10.1037/0003-066X.34.10.906

García-Pérez, D., Fraile, J., and Panadero, E. (2021). Learning strategies and self-regulation in context: how higher education students approach different courses, assessments, and challenges. Eur. J. Psychol. Educ. 36, 533–550. doi: 10.1007/s10212-020-00488-z

Garrison, R. D., and Vaughan, N. D. (2008). Blended learning in higher education: framework, principles, and guidelines. San Francisco: Jossey-Bass.

Geurten, M., Meulemans, T., and Lemaire, P. (2018). From domain-specific to domain-general? The developmental path of metacognition for strategy selection. Cogn. Dev. 48, 62–81. doi: 10.1016/j.cogdev.2018.08.002

Grimmett, P., and MacKinnon, A. (1992). “Craft Knowledge and the education of teachers” in Review of research education. ed. C. Grant (Washington, DC: American Educational Research Association), 385–456.

Hacker, D. J., Bol, L., Horgan, D. D., and Rakow, E. A. (2000). Test prediction and performance in a classroom context. J. Educ. Psychol. 92, 160–170. doi: 10.1037/0022-0663.92.1.160

Heikkilä, A., and Lonka, K. (2006). Studying in higher education: students' approaches to learning, self-regulation, and cognitive strategies. Stud. High. Educ. 31, 99–117. doi: 10.1080/03075070500392433

Hudesman, J., Crosby, S., Ziehmke, N., Everson, H., Isaac, S., Flugman, B., et al. (2014). Using formative assessment and self-regulated learning to help developmental mathematics students achieve: a multi-campus program. J. Excel. Coll. Teach. 25, 107–130.

Kranz, D., Schween, M., and Graulich, N. (2023). Patterns of reasoning – exploring the interplay of students’ work with a scaffold and their conceptual knowledge in organic chemistry. Chem. Educ. Res. Pract. 24, 453–477. doi: 10.1039/D2RP00132B

Krathwohl, D. R. (2002). A revision of Bloom’s taxonomy: an overview. Theory Pract. 41, 212–218. doi: 10.1207/s15430421tip4104_2

Kruger, J., and Dunning, D. (1999). Unskilled and unaware of it: how difficulties in recognizing one’s own incompetence lead to inflated self-assessments. J. Pers. Soc. Psychol. 77, 1121–1134. doi: 10.1037/0022-3514.77.6.1121

MacNeil, S., Wood, E., Zivcakova, L., Glover, R., and Smith, P. (2013). Learning task inventories (LTIs). Exploration of optimal conditions to help students develop, improve and sustain good study and learning practices. Coll. Essays Learn. Teach. 2, 1–9. doi: 10.22329/celt.v7i2.3976

Merriam, S. B. (2001). Andragogy and self‐directed learning: pillars of adult learning theory. New Directions for Adult and Continuing Education 89, 3–14. doi: 10.1002/ace.3

Pashler, H., Cepeda, N. J., Wixted, J. T., and Rohrer, D. (2005). When does feedback facilitate learning of words? J. Exp. Psychol. Learn. Mem. Cogn. 31, 3–8. doi: 10.1037/0278-7393.31.1.3

Pilcher, L. A., Potgieter, M., and Fletcher, L. (2023). Blending online homework and large class tutorials to provide learning support for introductory organic chemistry. Afr. J. Res. Math., Sci. Technol. Educ. 27, 1–13. doi: 10.1080/18117295.2022.2155771

Ratcliff, R., Allbritton, D., and McKoon, G. (1997). Bias in auditory priming. J. Exp. Psychol. Learn. Mem. Cogn. 23, 143–152. doi: 10.1037/0278-7393.23.1.143

Rivas, S. F., Saiz, C., and Ossa, C. (2022). Metacognitive strategies and development of critical thinking in higher education. Front. Psychol. 13:913219. doi: 10.3389/fpsyg.2022.913219

Roediger, H. L., and Karpicke, J. D. (2006). Test-enhanced learning: taking memory tests improves long-term retention. Psychol. Sci. 17, 249–255. doi: 10.1111/j.1467-9280.2006.01693.x

Saks, K., Ilves, H., and Noppel, A. (2021). The impact of procedural knowledge on the formation of declarative knowledge: how accomplishing activities designed for developing learning skills impacts teachers’ knowledge of learning skills. Educ. Sci. 11:598. doi: 10.3390/educsci11100598

Schraw, G., Crippen, K., and Hartley, K. (2006). Promoting self-regulation in science education: metacognition as part of a broader perspective on learning. Res. Sci. Educ. 36, 111–139. doi: 10.1007/s11165-005-3917-8

Schraw, G., and Dennison, R. S. (1994). Assessing metacognitive awareness. Contemp. Educ. Psychol. 19, 460–475. doi: 10.1006/ceps.1994.1033

Scott, B. M., and Berman, A. F. (2013). Examining the domain-specificity of metacognition using academic domains and task-specific individual differences. Aust. J. Educ. Dev. Psychol. 13, 28–43.

Zohar, A., and Dori, Y. J. (2012). “Metacognition in science education” in Trends in current research. ed. D. Zeidler (Dordrecht: Springer), 1–19.

Keywords: metacognition, organic chemistry, discipline-based education research, higher education, prediction and postdiction, scaffolded learning

Citation: MacNeil SL, Wood E and Arslantas F (2024) Development of a metacognition co-curriculum for a university course in introductory organic chemistry. Front. Educ. 9:1402599. doi: 10.3389/feduc.2024.1402599

Received: 17 March 2024; Accepted: 01 May 2024;
Published: 06 June 2024.

Edited by:

Brett McCollum, Thompson Rivers University, Canada

Reviewed by:

Jessica D'Eon, University of Toronto, Canada
W. Stephen McNeil, University of British Columbia, Okanagan Campus, Canada

Copyright © 2024 MacNeil, Wood and Arslantas. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Stephen L. MacNeil, smacneil@wlu.ca
