CURRICULUM, INSTRUCTION, AND PEDAGOGY article

Front. Educ., 16 May 2024
Sec. STEM Education
This article is part of the Research Topic Organic Chemistry Education Research into Practice.

Exploring alternative assessments during COVID: Instructor experiences using oral exams

Theresa Gaines1* and Nikita L. Burrows2

  • 1Department of Chemistry, Haverford College, Haverford, PA, United States
  • 2Department of Chemistry and Physics, Monmouth University, West Long Branch, NJ, United States

Conventional summative assessment methods, such as written exams, have been extensively documented in the literature. However, as academia evolves in response to changing dynamics, there is a growing demand for more robust summative assessment approaches. Oral exams have emerged as a noteworthy form of summative assessment with intrinsic formative qualities, enabling instructors to delve deeply into students’ comprehension within a meaningful learning framework. Considering the constraints imposed by the traditional written examination format during the COVID pandemic, two educators implemented oral assessments in their chemistry courses, one in general chemistry and the other in organic chemistry. This article presents a comprehensive account of their approach, course structure, rubrics, documentation procedures, and the challenges associated with implementing oral exams. Furthermore, the authors offer insights derived from perceived course outcomes, experiences, collaborative efforts, and reflections on this transformative process. Through candid exploration, this article delves into both the potential advantages and the hurdles associated with the adoption of oral exams in chemistry education. It serves as a valuable resource for educators seeking innovative assessment strategies.

1 Introduction

The use of oral exams as an alternative assessment method in chemistry education has gained some attention in recent years (Ramella, 2019). Traditionally, chemistry courses rely on written examinations to evaluate students’ understanding of concepts, problem-solving abilities, and growing expertise with course material; however, it has become increasingly evident that written exams alone may not fully capture the breadth and depth of students’ knowledge and skills in chemistry (Tienson-Tseng, 2019). Oral exams have been used as a means to provide a more comprehensive evaluation of students’ learning for a century (Muldoon, 1926), and have the potential to offer unique advantages, such as the opportunity for direct interaction between the examiner and student, the ability to assess critical thinking and communication skills, and the potential to create a more engaging and inclusive assessment environment. Herein, we delve into the use of oral exams in chemistry education, their benefits, challenges, and potential impacts on student learning outcomes. By exploring the practical aspects, pedagogical implications, and empirical evidence surrounding the use of oral exams, we aim to shed light on the significance of this alternative assessment method as a means to leverage research to inform practice.

1.1 Background and rationale

While oral examinations are not a novel method of assessing students (Muldoon, 1926), their integration into larger courses (twenty or more students) is not commonly practiced (Lubarda et al., 2021); most learners encounter formal oral evaluations for the first time in graduate school, professional school, or a job interview (Dicks et al., 2012). Recent discussions in the American Chemical Society (ACS) symposium series have highlighted the benefits of oral examinations, emphasizing their ability to thoroughly examine understanding and facilitate meaningful student-faculty discussions through leading questions (Ramella, 2019). Literature suggests these assessments have the potential to offer valuable insights into students’ problem-solving skills and information processing abilities (Bowen, 1994; Orgill and Sutherland, 2008; Walker and Sampson, 2013). Students are often more prepared for oral discussions due to the face-to-face nature of the examination, which motivates them to engage with the material at a deeper level; the assessment requires that students do more than memorize and regurgitate, resulting in better responses (Hambrecht, 2003; Boedigheimer et al., 2015; Iannone and Simpson, 2015). Similarly, instructors are able to engage with students and guide their critical thinking at a deep level in a way that written exams cannot facilitate (Gent et al., 1999).

Although there are only a few documented examples of oral assessments in the chemical education literature (e.g., Muldoon, 1926; Roecker, 2007; Dicks et al., 2012; Crawford and Kloepper, 2019), existing reports indicate positive student experiences with this evaluation method in both inorganic and organic chemistry classes (Roecker, 2007; Crawford and Kloepper, 2019). Burrows et al. (2021) found that students had a positive overall experience with oral assessments in laboratories, which is consistent with previous research on oral assessments in lecture courses (Roecker, 2007). Students specifically felt a sense of accountability for their preparation during lab interviews, which motivated them to avoid exposing gaps in their knowledge. This individual accountability led to active participation and knowledge construction during the lab session (Burrows et al., 2021).

When teaching, we encourage students to think in different ways with different tools. We think with our bodies and physical models (Gilbert and Treagust, 2009; Flood et al., 2015; Kiste et al., 2016); we think with the physical spaces we occupy and the digital spaces we can manipulate (Kozma and Russell, 2005). We also think by utilizing the people around us as tools. As educators, we encourage students to think with their peers collaboratively and, as they continue to grow into chemists, teach them to think in conjunction with the broader community to tackle big problems by writing, sharing, and discussing our work to build our collective knowledge. Shared attention, whether in broad collaboration (Gregory and Jackson, 2017) or in the same physical space (Shteynberg and Apfelbaum, 2013), increases our capacity to learn.

When thinking by utilizing the people around us, students learn to think by leveraging experts around them as tools (Paul, 2021). Novices can build expertise by gaining insights into the internal thought processes of experts (Collins et al., 1991); additionally, these expert-novice interactions are places where learners and educators can make the invisible parts of learning visible (Paul, 2021), and figure out next steps to take to support the learner’s growing expertise—which is in line with constructivist teaching practices (Ausubel et al., 1978; Bodner, 1986) and not dissimilar to cognitive apprenticeships (Collins et al., 1991). Oral exams are a way to intentionally allocate time to facilitate these novice-expert interactions that use a variety of modes of thinking.

1.2 Concerns about oral exams

Two concerns are frequently cited in the education literature regarding oral exams: (1) student anxiety and (2) fairness (Iannone and Simpson, 2015). In Huxham et al.’s (2012) study, focus group testimonies revealed that some students felt anxious because they recognized that they needed a deep understanding to explain concepts to their examiner in a way that was unique to oral exams. A similar sentiment was expressed by students who felt exposed during the oral assessments, as also reported in other studies (Iannone and Simpson, 2015). Within the social work education literature, there is some evidence that increasing students’ experience and familiarity with this type of assessment can lessen some of the anxiety they experience on subsequent oral assessments (Henderson et al., 2002; Huxham et al., 2012; Iannone and Simpson, 2015). This is especially noteworthy for STEM fields, as most students do not encounter oral assessments in their undergraduate careers (Goodman, 2020). Despite how anxiety-inducing oral exams might be (Díaz et al., 2001; Huxham et al., 2012; Iannone and Simpson, 2015), literature suggests that some anxious learners still prefer oral assessments over traditional, written exams (Huxham et al., 2012).

Unlike written exams, where students gather together in a room to take the same test, the one-on-one nature of oral exams creates opportunities for bias or prejudice that are not present in conventional exams (Heyneman et al., 2008). Oral examinations cannot be anonymized the way a written exam can be to reduce bias, and there are concerns that this leaves oral exams especially prone to bias and unfairness (Davis and Karunathilake, 2005). But as Iannone and Simpson (2015) express in their publication, bias is not always eliminated in written assessments, and it is hard to compare the bias remaining in each assessment type.

Due to the conversational nature of an oral exam, each conversation an examiner has with a student will be different, even if they are discussing the same questions. The examiner explores student knowledge so that gaps can be noted; however, those gaps do not prevent a student from demonstrating other knowledge. In an oral session, a student can move through, or around, stumbling blocks in the discussion to demonstrate other knowledge and insight in a way that is not possible on a written exam. In these scenarios, one student may receive a guiding question in one session that another student does not in theirs, but their overall outcomes might be the same. Some students have reported feeling this is an issue of fairness, but others saw it as a mode to get targeted feedback and correct course in progress (Iannone and Simpson, 2015).

Oral exams might also provide another mode of equitable assessment as a learning tool. For learners who are studying in a language that is not their native language, an oral assessment may allow them to use their strongest mode of communication (speech) to their advantage in a way they might not be able to on a timed, written exam (Huxham et al., 2012; Ramella, 2019). Oral exams might offer more equitable modes of communicating and assessing knowledge for students who experience dyslexia, dyscalculia, or dysgraphia (Waterfield and West, 2005; Huxham et al., 2012); however, there are other factors that must be considered when conducting oral exams, such as the power dynamics at play and the spaces being used to facilitate the assessment (Theobold, 2021).

In a study with biology students, those who took the oral exam scored significantly higher on their assessment than those who took the written assessment (Huxham et al., 2012). Students expressed that there was an authenticity to the format that made them feel like a professional (Joughin, 1998). While students take many written examinations in their undergraduate careers, once they move on to their professional lives, they will be expected to communicate in both written and oral formats to convey their ideas. Oral exams might be an important key to helping them build an aesthetic (fitting) and authentic professional identity.

Oral examinations provide an opportunity to rigorously probe understanding as leading questions can be asked that facilitate novice-expert or student-faculty discussions and interactions. The benefits of student–faculty interaction are numerous, including objective increases in grade point averages and in matriculation to post-graduate studies (Cotten and Wilson, 2006). Dr. Nikita Burrows and Dr. Theresa Gaines implemented oral exams in their chemistry courses, general chemistry, and organic chemistry, respectively, in the 2020–2021 academic year. In this work we will discuss our experiences and how we designed our courses to facilitate oral assessments in chemistry classrooms as evidence-based practices.

For more advice on implementing oral exams, we recommend looking at A Short Guide to Oral Assessment (Joughin, 2010) and Oral Exams: A More Meaningful Assessment of Students’ Understanding (Theobold, 2021).

2 Pedagogical frameworks

The primary theory guiding our educational practice is the theory of constructivism, a theory that describes how we gain knowledge. Constructivism posits that we actively piece together knowledge from our experiences (Bodner, 1986) and that learning is meaningful when we integrate new knowledge into our previously assembled knowledge structures (Ausubel et al., 1978).

Ausubel et al. (1978) and Bodner (1986) both highlight the advantages of probing student knowledge to subsequently guide learners in the next steps. Knowledge is not transferred from educator to learner as a complete and organized set; rather, learners build their knowledge in their minds as they have new experiences and are introduced to new information. Their meaningful learning depends on their ability to put the pieces together and order them in a way that makes sense with the world around them (Ausubel et al., 1978; Bodner, 1986).

As educators we strive to orchestrate environments, tasks, and scenarios that assist learners in building their own knowledge based on their current and prior learning experiences. In addition to constructing knowledge, we also hope that the classroom environments and activities we facilitate prompt students to make rich, multi-layered connections between different pieces of knowledge they have constructed. While we can create assessments to probe whether learners’ knowledge constructs map onto ours satisfactorily, both the assembly process and the whole interconnected knowledge construct are invisible (Collins et al., 1991). Engaging with learners regularly and deeply in dialog about their thought processes and how they reason when problem-solving might afford space to encourage learners to find places where gaps in knowledge can be filled, or where important connections within their knowledge web can be made.

In previously published studies where students shared about their oral exam experiences, the idea of the assessment feeling authentic emerged as a theme (Joughin, 1998). Kharkhurin (2014) developed four criteria for a work to be considered creative: (1) novelty, (2) utility, (3) aesthetic, and (4) authentic. While learners are (often) not yet using chemistry as a tool to iterate and generate new knowledge, flexing their expertise and creativity, they are building their expertise in chemistry and identity as chemists by exploring these dimensions independently through their courses, assessments, and practice. Kharkhurin (2014) defines authenticity as honesty in the process that is then perceived in the output. Students recognized that having a conversation about their results was similar to how they might convey them as a professional chemist working elsewhere. It felt authentic.

Aesthetic deals with intentions and how those intentions are conveyed: when giving a presentation or interview, did you intend to appear as a chemist, and were you perceived that way? As stated previously, timed, written assessments are not common beyond school; however, oral dialogs are extremely common, making oral assessment an aesthetically fitting match for the practice of chemistry. Oral exams may give students an opportunity to explore their identities in an authentic and aesthetic manner as they continue to build their expertise and think creatively within their craft.

3 Learning environment

3.1 Course set-up

Due to the COVID-19 pandemic, both courses were online only. Dr. Burrows’ general chemistry course was modeled after a flipped classroom: students were provided videos with which to engage before synchronous class time. The videos were hosted on PlayPosit, which prompted students with questions to answer as they progressed through them to probe understanding. Synchronous meeting time consisted of targeted mini-lectures and problem-solving sessions. This class had weekly quizzes and three oral exams.

Dr. Gaines’ organic chemistry course was asynchronous. Students were provided a series of video lectures and guided notes to fill in while watching the videos each week. In place of synchronous problem-solving sessions, Dr. Gaines hosted daily, elective problem-solving sessions that students could attend synchronously if they wished; these sessions were recorded for students to access later. Students submitted weekly problem sets and had three oral exams. Dr. Gaines’ class had thirty students in the fall semester and twenty in the spring semester where oral exams were implemented. Dr. Burrows had twenty students in both semesters.

3.2 Design and structure

Both instructors designed the oral exams collaboratively. Each exam consisted of five multi-step, scaffolded questions that covered specific learning outcomes: questions 1 and 2 covered one set of learning outcomes, questions 3 and 4 covered a second set, and question 5 covered a third, unique set, giving three sets of learning outcomes per exam. These exams were initially given as an open-resource (textbook, internet resources, notes), take-home assessment. Students had a week to work on them before they were asked to sit the oral exam.

The take-home portions of the exams were distributed through each instructor’s learning management system (LMS). Dr. Burrows’ institution used Desire2Learn (D2L) and Dr. Gaines’ institution used Canvas. Students were required to upload a PDF copy of their worked exam to the LMS before they sat their exam. This PDF was not assessed but was held for record-keeping purposes. It also afforded an opportunity for the instructor to prepare for the session by previewing how the student solved the problems. In Dr. Gaines’ course, students were encouraged to collaborate on the take-home portion of the assessment. Students in Dr. Burrows’ courses were not.

Students signed up for times they wanted to take their oral exam either via their LMS or via Calendly based on the thirty-minute timeslots the instructor created. The exams were hosted on Zoom during the scheduled time. Either the student could share their screen with the PDF of their exam, or the professor would share their screen showing the student’s PDF. This helped to facilitate conversation and gave a method to draw if needed.

During the oral exam, discussion centered on three of the five questions. Students selected the first question they wanted to go over, prompted to identify the one they felt most confident about; the instructor then picked the other two questions for discussion, choosing questions that did not cover the same learning outcomes as the student’s initial choice. Due to the structure of the learning outcomes in each exam question, all three sets of learning outcomes could still be assessed no matter which question a student elected to go through first. On average, students finished their oral exam within twenty minutes, with prepared students tending to finish faster than those who were less prepared.
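
To make the coverage logic concrete, the short Python sketch below reconstructs the pairing of questions to learning-outcome sets. This is our illustration, not code used in either course, and the set labels “A,” “B,” and “C” are hypothetical:

    # Each exam question maps to the learning-outcome set it covers:
    # questions 1 and 2 share one set, 3 and 4 share a second,
    # and question 5 alone covers the third.
    OUTCOME_SETS = {1: "A", 2: "A", 3: "B", 4: "B", 5: "C"}

    def pick_discussion_questions(student_first_choice: int) -> list[int]:
        """Return three questions that together cover all three outcome sets.

        The student leads with the question they feel most confident about;
        the instructor then selects one question from each remaining set.
        """
        chosen = [student_first_choice]
        covered = {OUTCOME_SETS[student_first_choice]}
        for question, outcome_set in OUTCOME_SETS.items():
            if outcome_set not in covered:
                chosen.append(question)
                covered.add(outcome_set)
        return chosen

    # A student most confident in question 4 might then be asked
    # about questions 1 and 5, covering the other two sets.
    print(pick_discussion_questions(4))  # [4, 1, 5]

Because paired questions are interchangeable with respect to outcomes, any first choice still leaves at least one selectable question in each remaining set.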

3.3 Rubrics

For each question discussed in the oral portion, students were assessed with a rubric covering eight categories:

1. Overall understanding of the content

2. Communication

3. Valid structures (in their PDF)

4. Argument

5. Evidence

6. Reasonable answers

7. Calculations (Burrows) or curved arrows (Gaines)

8. Prompting (mistakes)

The students were scored in each category as exemplary (5 points), competent (4 points), developing (3 points), or emerging (0–2 points). This gave a maximum of 40 points per question and 120 points for the exam. In Dr. Gaines’ course, this was modified after students expressed concern that the scale was too punishing. After discussing with the class, the following was implemented: exemplary (5 points), competent (4.25 points), developing (3.5 points), or emerging (0–2 points).
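
As a quick check on this arithmetic, the sketch below (again our illustration in Python, using the level names from the rubric; it is not the instructors’ grading code) totals an exam under both scales:

    # Rubric levels and points for the original and revised scales.
    # "Emerging" (0-2 points) is scored at the grader's discretion
    # and is omitted from these fixed lookups.
    ORIGINAL_SCALE = {"exemplary": 5.0, "competent": 4.0, "developing": 3.0}
    REVISED_SCALE = {"exemplary": 5.0, "competent": 4.25, "developing": 3.5}

    CATEGORIES_PER_QUESTION = 8   # the eight rubric categories
    QUESTIONS_DISCUSSED = 3       # three of the five questions are discussed

    def exam_score(ratings: list[str], scale: dict[str, float]) -> float:
        """Sum rubric points over every category of every discussed question."""
        assert len(ratings) == CATEGORIES_PER_QUESTION * QUESTIONS_DISCUSSED
        return sum(scale[r] for r in ratings)

    # Maximum: 8 categories x 5 points x 3 questions = 120 points.
    print(exam_score(["exemplary"] * 24, ORIGINAL_SCALE))   # 120.0

    # An all-"competent" exam earns 96/120 on the original scale but
    # 102/120 on the revised scale, softening the drop between levels.
    print(exam_score(["competent"] * 24, ORIGINAL_SCALE))   # 96.0
    print(exam_score(["competent"] * 24, REVISED_SCALE))    # 102.0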

Another modification implemented in Dr. Gaines’ course was a “Redo” system, in which students could choose to retake any single oral assessment during the semester to replace the initial grade. All aspects of the assessment were the same as the first attempt.

3.4 Keeping receipts/documentation

McCloud (2023) wrote a great piece on keeping evidence in an ungraded class, but this advice can be applied more broadly when implementing alternative, non-traditional assessments. Grades are the currency by which students get internships, jobs, scholarships, and placements in medical or graduate schools. Alternative assessments like oral exams may scare students; they may be anxious as to whether they can successfully collect the currency they want to leverage later (McCloud, 2023).

For underrepresented or historically excluded faculty, this student anxiety and uncertainty may reflect negatively upon us in teaching evaluations; therefore, keeping evidence is necessary protective documentation when implementing non-traditional pedagogical practices as we may not always receive the benefit of the doubt in extenuating circumstances (McCloud, 2023).

Both instructors kept record of evidence by:

1. Using a standardized rubric for every oral exam that students had access to before the assessment opened (permanently posted in the LMS)

2. Assigning students to complete the assessment as a take-home assessment first (PDF uploaded to the LMS)

3. Requiring students to upload their finished take-home assessment as a PDF into the course LMS before sitting the oral portion

4. Explicitly permitting and encouraging students to refer to their PDF to guide the explanation in the oral exam (instructions in LMS)

5. Using the rubric as the scorecard and giving it to students as part of their feedback (also through the LMS).

For both instructors, it was imperative to use the LMS as much as possible, as doing so created a record of what materials were made available, what emails were sent, and what appointments were created. It is an independent record that also tracks changes, which could be used in case of any dispute. All details regarding course design and rubric use can be found in the Supplementary material.

4 Results to date and assessment

4.1 Course outcomes

The information expressed in this section is anecdotal in the absence of IRB approval. Overall, in both general chemistry and organic chemistry, students performed well on the oral exams compared to traditional teaching methods pre-COVID. However, some students struggled in ways that could be unique to the oral exam format. Without further study, it is impossible to parse whether these difficulties were due to the format or, in part, to remote instruction during the pandemic. During the oral exams, students initially struggled with how to prepare effectively; students typically required considerable prompting to articulate mechanistic details of reactions, and this aspect saw slow improvement over the course of the year. Although many students were able to arrive at answers through conversation, some prompting was necessary to elicit the relevant information from them. Overall, while the oral exams yielded better course outcomes than the prior teaching methods, we would be interested in investigating how they boost understanding and how they magnify areas of struggle.

4.2 Academic integrity

During the oral exam, students are responsible for demonstrating their own knowledge by walking the instructor through how they solved the exam problems. In such instances, it is hard for a student to pretend to know something they don’t understand when an expert is asking questions. There were no instances of academic dishonesty in either set of courses.

4.3 Future work and areas of interest

In the face of remote instruction due to COVID-19, the method of oral examination was appealing as it provided a structure to interact one-on-one with learners in our online chemistry courses. This format was implemented without the intention to study the impacts of oral exams on learners. Now, post-implementation, and post-reflection, we would like to discuss our plans to collect data when implementing these assessments again.

To contrast cohorts between traditional and oral assessments, we would collect scores on comparable assessments for both. This would be useful to support our observation that students performed well. In addition to grades on assessments, investigating how students prepare for their assessment would be valuable for understanding what fuels any differences between oral and traditional performance. Similarly, student anxiety could be probed pre- and post-assessment; these latter two items would be collected via survey. Lastly, if time permits, interviewing students about their experiences in a class that uses oral assessments, and about their experiences during the assessment itself, would be valuable to document.

5 Discussion on the practical implications, objectives and lessons learned

5.1 Professor experience—Nikita

Implementing oral exams in my chemistry courses was a rewarding experience that brought several advantages. Collaborating with another professor to troubleshoot and strategize created a supportive community of practice, where we could bounce ideas off each other and tackle any issues that arose. This collaborative effort eased the process of incorporating oral exams into the curriculum and reinforced our commitment to this assessment method. Moreover, the collaborative student environment fostered through the Zoom classroom was highly interactive and responsive. My familiarity with breakout rooms encouraged active engagement, and my students embraced the opportunity to work on oral exams independently, seeking my assistance during office hours when needed. The one-on-one interactions during the oral exams allowed for deeper insights into students’ knowledge and comprehension levels. In the context of general chemistry, the assessment format compelled students to confront the concepts directly, as they couldn’t rely solely on mathematical calculations to mask their understanding. It provided a comprehensive evaluation of their communication abilities across various domains.

The formative and summative aspects of the oral exams complemented each other, creating a holistic assessment approach. Although the exams were summative in nature, the immediate feedback provided during the oral sessions was invaluable to students. Unlike traditional written exams where feedback might go unread, students appreciated understanding why points were deducted and how they could improve. This enhanced their learning experience and motivated them to actively engage with their feedback. While the time investment in conducting oral exams was significant, the payoff was substantial. Students demonstrated a deeper grasp of the material, and some even reached out to me after moving to other schools, seeking my guidance based on their positive oral exam experience.

As for the continuation of this assessment method, I regrettably had to discontinue it for my multi-section classes. The consistency and fairness in assessments across sections demanded uniformity, making it impractical to implement oral exams in this context. However, I firmly believe that for smaller classes or online formats, oral exams can continue to be an effective means of assessing student comprehension and enhancing their learning journey. The interactive and personalized nature of oral exams not only provides valuable insights into student understanding but also instills a sense of accountability and responsibility for their own learning. As educators, we should continuously explore and adapt assessment approaches to best serve our students’ academic and intellectual growth.

5.2 Professor experience—Theresa

In my experience, I would be willing to try oral exams again. I think the struggle of overcoming logistical issues is worth the growth and conversations that oral exams afford. My organic classes at the time were online and asynchronous. The oral exam format provided a synchronous, candid, face-to-face experience with each of my students that my traditional, in-person classrooms would still benefit from. Increasingly, students in my classrooms value collaboration. For them and their peers, willingness to help others who might not understand the material feels integral to their integrity and their character. Most instances of academic dishonesty that I have experienced in my classrooms stem from collaboration as a core value. My solution, through oral exams, was to give space for that collaboration.

In the oral exam setting, students had a chance to show what they had learned, but not at the cost of preventing collaboration with others if that is what they wanted to do. For students, this process illuminated whether their collaborative efforts were lacking or counter-productive. In study groups, it is easy to hide behind someone else’s explanation, but in the oral exam, each student leads the conversation. Several of my students realized through this experience that they were not leaving their collaborative sessions with anything but answers; that was not sufficient for our oral exam, which asked them to re-trace the process of deducing an answer. In written exams, it is difficult to give someone the benefit of the doubt about how much they understand if the answer is ambiguous. In oral exams, we can explore to find if and where a gap in knowledge exists and address it. It was a convenient place to apply just-in-time teaching, as there would be future assessments to which our conversation would be applicable.

As the instructor, it was beneficial to have Dr. Burrows with me. When I was stuck or if I ran into roadblocks, it was nice to talk to her and discuss if there was a bigger issue in our implementation or if there was something specific to my class and my students. Currently, I am not using oral exams. Initially when we went back to in-person classes, figuring out when and how to schedule the oral exams was an issue that I couldn’t resolve before the semester started. Now I’m at a new institution and I am teaching a brand-new set of courses. I would prefer to become more familiar with these courses before changing how the exams are conducted.

5.3 Perceived student experiences

In the initial stages of the oral assessment implementation, students exhibited apprehension and nervousness, as many had never encountered such a form of evaluation before. However, as they experienced the process firsthand, their perceptions evolved positively. Students found that the oral assessment was not as intimidating as they initially thought and realized that it provided a unique opportunity to showcase their understanding of the subject matter. This phenomenon was previously described in the literature (Burrows et al., 2021). The shift in students’ questions during the assessment was notable, moving from concerns about whether they had done the task correctly to inquiries about their conceptual comprehension and problem-solving approach. Students also felt comforted in their ability to choose the question they were most confident in as the first question we discussed. This change in focus indicated a deeper engagement with the material and a desire to demonstrate their true understanding rather than regurgitation of facts. To mitigate some of the initial student apprehension, several strategies were employed, such as offering the exam as a take-home version initially to acclimatize students to the format, allowing for redos to encourage learning from mistakes, and implementing policies such as dropping the lowest test score to alleviate pressure and foster a growth-oriented mindset. These measures helped create a supportive environment, fostering student confidence and active participation during the oral assessment.

5.4 Overall take-aways

One of the largest benefits that oral exams afforded was the ability and place to have a candid conversation with a student about where they are excelling in their understanding and where they need extra support. This requires a particular faculty mindset, but with the option to revise their work and retake a limited number of oral exams, this can be a powerful iterative process founded in care. It also tended to be a reality check for students who were working with their classmates, helping them realize when they did not fully understand the material. They could then rethink how they want to study and whether there is an individual component that works well to supplement the collaborative portions of preparing for the oral exams.

Both professors found that engaging in a community of practice related to oral exams while implementing them was invaluable, but agree that in a face-to-face course this type of assessment would be difficult to replicate for classes larger than fifteen students without extra support. The biggest limitation to this type of assessment is the time required. While remote due to COVID-19, we had a lot of flexibility in our assignment times; reproducing this in a face-to-face class is not impossible but would require greater flexibility and attention to detail with the time commitments and scheduling.

6 Acknowledgment of constraints

6.1 Institutional regulations

We would like to acknowledge upfront that there are significant limitations that might preclude others from implementing oral exams in their courses. Firstly, some institutions or departments might have regulations that dictate how exams and assessments must be conducted, and this style of assessment might be out of compliance with those regulations. Class size and structure might be another limitation on whether this type of assessment is feasible.

6.2 COVID-19 pandemic

One constraint faced by Dr. Gaines was that their institution prohibited the use of synchronous formats for courses conducted remotely. The institution was in a remote, rural area, and while instruction was remote, most students did not have reliable internet access at home. Problem-solving sessions were available every day, but students were not required to attend.

6.3 Logistics and time considerations

Setting up one-on-one meeting times was done either through Canvas’ scheduling feature or through Calendly. To find time for these meetings, synchronous sessions were dismissed for the exam period; similarly, lab periods were used to schedule sessions. If a student wanted to meet outside those times, that could also be arranged depending on the instructor’s schedule. These exams were conducted in online courses, so time was more flexible than in traditional, in-person synchronous classes.

Data availability statement

The original contributions presented in this study are included in this article/Supplementary material; further inquiries can be directed to the corresponding author.

Author contributions

TG: Conceptualization, Methodology, Project administration, Writing – original draft, Writing – review & editing. NB: Conceptualization, Methodology, Project administration, Writing – original draft, Writing – review & editing.

Funding

The authors declare that no financial support was received for the research, authorship, and/or publication of this article.

Acknowledgments

We would like to acknowledge and thank our students who were gracious and worked with us as we tried something new while navigating a pandemic.

Conflict of interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Publisher’s note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

Supplementary material

The Supplementary Material for this article can be found online at: https://www.frontiersin.org/articles/10.3389/feduc.2024.1379886/full#supplementary-material

References

Ausubel, D. P., Novak, J. D., and Hanesian, H. (1978). Educational psychology: A cognitive view, 2nd Edn. New York, NY: Holt, Rinehart and Winston.

Bodner, G. M. (1986). Constructivism: A theory of knowledge. J. Chem. Educ. 63:873. doi: 10.1021/ed063p873

Boedigheimer, R., Ghrist, M., Peterson, D., and Kallemyn, B. (2015). Individual oral exams in mathematics courses: 10 years of experience at the air force academy. PRIMUS 25, 99–120. doi: 10.1080/10511970.2014.906008

Bowen, C. W. (1994). Think-aloud methods in chemistry education: Understanding student thinking. J. Chem. Educ. 71:184. doi: 10.1021/ed071p184

Burrows, N. L., Ouellet, J., Joji, J., and Man, J. (2021). Alternative assessment to lab reports: A phenomenology study of undergraduate biochemistry students’ perceptions of interview assessment. J. Chem. Educ. 98, 1518–1528. doi: 10.1021/acs.jchemed.1c00150

Collins, A., Brown, J., and Holum, A. (1991). Cognitive apprenticeship: Making thinking visible. Am. Educ. Prof. J. Am. Fed. Teach. 15, 6–46.

Cotten, S. R., and Wilson, B. (2006). Student–faculty interactions: Dynamics and determinants. High. Educ. 51, 487–519. doi: 10.1007/s10734-004-1705-4

Crawford, G. L., and Kloepper, K. D. (2019). Exit interviews: Laboratory assessment incorporating written and oral communication. J. Chem. Educ. 96, 880–887. doi: 10.1021/acs.jchemed.8b00950

Davis, M. H., and Karunathilake, I. (2005). The place of the oral examination in today’s assessment systems. Med. Teach. 27, 294–297. doi: 10.1080/01421590500126437

Díaz, R. J., Glass, C. R., Arnkoff, D. B., and Tanofsky-Kraff, M. (2001). Cognition, anxiety, and prediction of performance in 1st-year law students. J. Educ. Psychol. 93, 420–429. doi: 10.1037/0022-0663.93.2.420

Dicks, A. P., Lautens, M., Koroluk, K. J., and Skonieczny, S. (2012). Undergraduate oral examinations in a university organic chemistry curriculum. J. Chem. Educ. 89, 1506–1510. doi: 10.1021/ed200782c

Flood, V. J., Amar, F. G., Nemirovsky, R., Harrer, B. W., Bruce, M. R. M., and Wittmann, M. C. (2015). Paying attention to gesture when students talk chemistry: Interactional resources for responsive teaching. J. Chem. Educ. 92, 11–22. doi: 10.1021/ed400477b

Gent, I., Johnston, B., and Prosser, P. (1999). Thinking on your feet in undergraduate computer science: A constructivist approach to developing and assessing critical thinking. Teach. High. Educ. 4, 511–522. doi: 10.1080/1356251990040407

Gilbert, J. K., and Treagust, D. F. (2009). “Introduction: Macro, submicro and symbolic representations and the relationship between them: Key models in chemical education,” in Multiple representations in chemical education, eds J. K. Gilbert and D. Treagust (Dordrecht: Springer Netherlands), 1–8. doi: 10.1007/978-1-4020-8872-8_1

Goodman, A. L. (2020). Can group oral exams and team assignments help create a supportive student community in a biochemistry course for nonmajors? J. Chem. Educ. 97, 3441–3445. doi: 10.1021/acs.jchemed.0c00815

Gregory, S. E. A., and Jackson, M. C. (2017). Joint attention enhances visual working memory. J. Exp. Psychol. Learn. Mem. Cogn. 43, 237–249. doi: 10.1037/xlm0000294

Hambrecht, G. (2003). Oral examinations: A new measure of learner success. Delta Kappa Gamma Bull. 69, 31–32. doi: 10.1002/ase.1252

Henderson, J., Lloyd, P., and Scott, H. (2002). “In the real world we’re all put on the spot at some time or other, so you need to be prepared for it”: An exploratory study of an oral method of assessing knowledge of mental health law. Soc. Work Educ. 21, 91–103. doi: 10.1080/02615470120107040

Heyneman, S. P., Anderson, K. H., and Nuraliyeva, N. (2008). The cost of corruption in higher education. Comp. Educ. Rev. 52, 1–25. doi: 10.1086/524367

Huxham, M., Campbell, F., and Westwood, J. (2012). Oral versus written assessments: A test of student performance and attitudes. Assess. Eval. High. Educ. 37, 125–136. doi: 10.1080/02602938.2010.515012

Iannone, P., and Simpson, A. (2015). Students’ views of oral performance assessment in mathematics: Straddling the ‘assessment of’ and ‘assessment for’ learning divide. Assess. Eval. High. Educ. 40, 971–987. doi: 10.1080/02602938.2014.961124

Joughin, G. (1998). Dimensions of oral assessment. Assess. Eval. High. Educ. 23, 367–378.

Joughin, G. (2010). A short guide to oral assessment. Leeds: Leeds Met Press in association with University of Wollongong.

Kharkhurin, A. V. (2014). Creativity.4in1: Four-criterion construct of creativity. Creat. Res. J. 26, 338–352. doi: 10.1080/10400419.2014.929424

Kiste, A. L., Hooper, R. G., Scott, G. E., and Bush, S. D. (2016). Atomic tiles: Manipulative resources for exploring bonding and molecular structure. J. Chem. Educ. 93, 1900–1903. doi: 10.1021/acs.jchemed.6b00361

Kozma, R., and Russell, J. (2005). “Students becoming chemists: Developing representational competence,” in Visualization in science education, ed. J. K. Gilbert (Dordrecht: Springer Netherlands), 121–145.

Lubarda, M., Delson, N., Schurgers, C., Ghazinejad, M., Baghdadchi, S., Phan, A., et al. (2021). “Oral exams for large-enrollment engineering courses to promote academic integrity and student engagement during remote instruction,” in Proceedings of the IEEE Frontiers education conference FIE, (Lincoln, NE), doi: 10.1109/fie49875.2021.9637124

McCloud, L. I. (2023). Keeping receipts: Thoughts on ungrading from a black woman professor. Zeal J. Lib. Arts 1, 101–105.

Muldoon, J. A. (1926). The value of oral examinations. J. Chem. Educ. 3:773. doi: 10.1021/ed003p773

Orgill, M., and Sutherland, A. (2008). Undergraduate chemistry students’ perceptions of and misconceptions about buffers and buffer problems. Chem. Educ. Res. Pract. 9, 131–143. doi: 10.1039/B806229N

Paul, A. M. (2021). The extended mind: The power of thinking outside the brain. Boston, MA: Houghton Mifflin Harcourt.

Ramella, D. (2019). “Oral exams: A deeply neglected tool for formative assessment in chemistry,” in Proceedings of the active learning in general chemistry: Specific interventions ACS symposium series, (Washington, DC: American Chemical Society), 79–89. doi: 10.1021/bk-2019-1340.ch006

Roecker, L. (2007). Using oral examination as a technique to assess student understanding and teaching effectiveness. J. Chem. Educ. 84:1663. doi: 10.1021/ed084p1663

Shteynberg, G., and Apfelbaum, E. P. (2013). The power of shared experience: Simultaneous observation with similar others facilitates social learning. Soc. Psychol. Pers. Sci. 4, 738–744. doi: 10.1177/1948550613479807

Theobold, A. S. (2021). Oral exams: A more meaningful assessment of students’ understanding. J. Stat. Data Sci. Educ. 29, 156–159. doi: 10.1080/26939169.2021.1914527

Tienson-Tseng, H. L. (2019). “Best practices in summative assessment,” in Proceedings of the biochemistry education: From theory to practice ACS symposium series, (Washington, DC: American Chemical Society), 219–243. doi: 10.1021/bk-2019-1337.ch010

Walker, J. P., and Sampson, V. (2013). Learning to argue and arguing to learn: Argument-driven inquiry as a way to help undergraduate chemistry students learn how to construct arguments and engage in argumentation during a laboratory course. J. Res. Sci. Teach. 50, 561–596. doi: 10.1002/tea.21082

Waterfield, J., and West, B. (2005). Inclusive assessment in higher education: A resource for change. Plymouth: University of Plymouth.

Keywords: oral exam, alternative assessments, organic chemistry, general chemistry, instructor experience

Citation: Gaines T and Burrows NL (2024) Exploring alternative assessments during COVID: Instructor experiences using oral exams. Front. Educ. 9:1379886. doi: 10.3389/feduc.2024.1379886

Received: 31 January 2024; Accepted: 29 April 2024;
Published: 16 May 2024.

Edited by:

Jay Wackerly, Central College, United States

Reviewed by:

Grace Ferris, Lesley University, United States
Melonie Teichert, United States Naval Academy, United States

Copyright © 2024 Gaines and Burrows. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Theresa Gaines, tgaines@haverford.edu

These authors have contributed equally to this work
