- Prosthodontic Department, College of Medicine and Dentistry, Riyadh Elm University, Riyadh, Saudi Arabia
The Objective Structured Clinical Examination (OSCE) is a performance-based assessment designed to evaluate students' clinical competency in a simulated, standardized environment. Because it measures the student's ability to apply clinical knowledge, diagnostic skill, and decision-making, the OSCE is considered more objective than traditional tests. OSCE exams have been increasingly employed in dental schools, particularly in the last decade, and it is important to investigate instructors' and dental students' experiences with this evaluation approach.
Introduction
The Objective Structured Clinical Examination (OSCE) is a performance-based exam developed by Harden in the 1970s to assess students' competencies in relation to knowledge, skills, and attitudes in actual or simulated standardized situations (1). The OSCE is thought to be more objective than traditional exams. Rather than testing knowledge only, the OSCE evaluates the student's ability to use clinical knowledge, diagnostic skills, and decision making (2).
The evaluation of students' competency in areas such as communication, problem-solving, and decision-making can be challenging. However, OSCEs offer significant advantages in assessing these areas compared to other types of clinical exams, with higher reliability, validity, and objectivity (3).
OSCE exams are increasingly being used in dental schools. Moreover, many dental regulatory bodies are adopting the OSCE as a method of assessment for licensure examinations (4).
The purpose of this article is to review the existing literature on the use of OSCE exams in dental schools and the benefits they provide in measuring clinical competence. A PubMed search was undertaken to identify literature on the use of OSCE exams in dental schools. “OSCE, dentistry, dental education” were the keywords used, and the search was restricted to English-language papers.
The use of OSCE in dental schools
OSCE exams have been found to be effective in identifying the strengths and weaknesses of dental students and providing feedback to help them improve their clinical skills. Typically, the exams are designed to examine clinical skills and knowledge in a controlled and standardized setting. The OSCE in dentistry is designed to examine students' competencies in a variety of areas, including knowledge, skills, clinical reasoning, anamnesis, communication (both verbal and written), oral health education, and patient interaction (5).
OSCEs have been found to offer various benefits in dental schools, including better inter-examiner reliability, standardized evaluation, and the capacity to assess a wide range of clinical abilities. They ensure that all students are assessed on the same set of competencies and that the assessment is objective and fair. Furthermore, OSCE exams provide students with a safe and controlled setting in which to practice their clinical skills and gain confidence in their abilities (6).
Previous research has found a link between OSCE scores and clinical and didactic performance, lending credence to the utility of OSCEs as a form of assessment (7).
OSCE exams are used in dental schools across the world. The experience of dental schools in administering OSCE exams varies greatly depending on each school's resources and expertise. Some schools have specialized staff or departments to supervise the design and administration of OSCE exams, whereas others rely on faculty or other staff members (5, 8).
A common method for administering an OSCE exam in a dental school is to set up a series of stations, each with a particular task or scenario for the student to complete within a set amount of time. Qualified examiners observe the stations and grade students against specified criteria (9).
Standardized simulated patients are also used in some dental schools' OSCE exams. Standardized patients are trained actors who portray real patients with specific diseases or concerns, providing the student with a realistic and standardized experience. Simulated patients may include both actors and non-teaching university personnel. They are trained to play specific roles using scripts developed for each station. To guarantee the objectivity of the exam, the scripts must be detailed and comprehensive, and their confidentiality and security must be ensured at all times. Simulated patients must behave consistently with all students, including in the emotions they express during the exam, in order to provide a fair and objective assessment of students' skills. Actors must be prepared and trained to standardize their involvement, which is critical for evaluating not only the student's examination skills but also their interpersonal skills (9, 10).
The administration of OSCE adheres to fundamental principles and allows flexibility in its implementation based on targeted learning outcomes and tested competencies. The design of OSCE levels, the selection of standard cases or situations, and the assessment criteria can be tailored to emphasize specialized clinical skills or competencies related to one or more learning domains (10). The review of the literature showed that OSCE exams are regarded as an effective tool for assessing clinical competence in different dental specialties, such as restorative dentistry, prosthodontics, endodontics, periodontics, oral surgery, oral medicine, oral radiology, orthodontics, and pediatric dentistry (10–15).
For instance, in one published OSCE scenario in periodontology, a 21-year-old actress played the role of a complex patient presenting with necrotizing gingivitis (NG). She was given a script to follow and prior training to clarify any doubts and establish a realistic, consistent portrayal. The examiners assessed the students' reaction to and management of this situation, grading them on communication skills, diagnostic judgment, treatment strategies, preventive measures, and ethical considerations. Performance was scored against a 47-point checklist, and students needed at least 50% of the total marks to pass this station, which comprehensively tested their ability to manage both the patient and the case (10).
Another example of the use of OSCE in dental schools is reported by Fields et al. (14) who introduced OSCE to an advanced orthodontic education program to evaluate its impact on the curriculum. To formulate the examination's content, 60 orthodontic practitioners were contacted to assess which clinical skills are essential for an entry-level practitioner to possess. Thirteen of the eighteen crucial clinical skills were assessed by the OSCE in the domains of orthodontic technique, clinical evaluation and synthesis, and diagnosis (14).
Höfer et al. (15) introduced the OSCE as a method of assessment in a problem-based learning (PBL) curriculum. Students in a cranio-maxillofacial surgery practical course rotated through 10 stations presenting clinical scenarios. These stations covered trauma (managing a fractured mandible or malar bone; managing intraoral or extraoral bleeding), practical skills (figure-of-eight ligature, probe biopsy, intravenous line placement, wire ligature, reducing a dislocated mandible), diagnosis (describing intraoral tumors, x-rays, CT scans, craniofacial examination), and oncology (e.g., excision of an intraoral tumor). An examiner used a standardized multi-item checklist to observe and evaluate each student's performance during each five-minute station (15).
In dental schools, the OSCE can be used to assess competence in a single subject as well as across multiple domains and specialties. Assessing several domains at each station was reported to be more realistic and helped students understand the relevance of the OSCE (13). An example is reported by Manogue and Brown, who prepared a total of 17 stations in the areas of conservative dentistry, periodontology, and prosthodontics. Across the subjects, nine stations tested knowledge, four tested procedures, five tested clinical reasoning, three tested history taking, four tested communication, and four tested techniques; one station also tested the provision of oral health education. Fifteen stations had 5-minute tasks and the remaining two had 10-minute tasks. Four stations incorporated simulated patients, e.g., where a medical history was to be taken (13).
Checklists were previously used to assess student performance in tasks such as problem-solving, but this approach has drawbacks because it allows examiners merely to observe rather than interpret behaviors. Rating scales that use a Likert-type format to judge performance across multiple areas have been advocated as a more valid technique for analyzing candidate behaviors. However, training examiners to use these scales can take time, so e-learning aids such as videos and training packages may be a more practical option (16).
In addition to the actual administration of the OSCE exam, dental schools may also invest significant resources in the development of the exam. This may include the creation of standardized scenarios or cases, the development of assessment rubrics, and the training of examiners (17).
The findings of OSCEs can provide educators with useful information when designing clinical dental education programs. Some dental schools now incorporate feedback from OSCE exams into their curricula. Students may receive a thorough report on their performance, including strengths and weaknesses, after the exam to help them identify areas for improvement. Students and faculty found a combination of written and audio feedback approaches to be more beneficial (18). The feedback offered should characterize test performance in terms of the core learning domains the test attempts to examine and include the resources needed to implement improvements at an individual level. This form of feedback is particularly beneficial to students because it allows them to focus their learning efforts on areas that require development (19).
Student experiences with OSCE
Many dental schools have students evaluate their OSCE exams. This feedback is used to improve the exam's design, delivery, and evaluation. Students frequently ask for more time, clearer instructions, and more opportunities for feedback and guidance from faculty (9).
Student opinions and experiences with OSCE exams in dental schools vary. Overall, student feedback has been positive, with many students recognizing the value of the OSCE in assessing clinical skills and appreciating the practical approach of OSCE exams, which tests their ability to apply theoretical knowledge in a clinical setting (19). The structured format of the OSCE also ensures that all students are tested on the same skills and given equal opportunities to demonstrate their skills and knowledge. Several studies have reported that the majority of dental undergraduate respondents would choose a similar OSCE format in the future, reflecting students' support for this type of assessment (8).
Students from different dental schools reported that they felt OSCE went beyond fact memorization, required knowledge application, demanded critical thinking and problem-solving abilities, and was an authentic educational experience (20).
Some students, however, found the OSCE exams intimidating, unpleasant, and anxiety-inducing, especially when there were multiple stations with limited time to complete each task. The timed, interactive components of the OSCE, as well as the examiners' continuous observation and monitoring, may explain the higher anxiety levels that have been reported. According to one study, there is no reason to stop using OSCEs as an assessment tool for dental undergraduates, because the reported levels of anxiety among students did not predict their OSCE results (21).
Furthermore, some students thought that the scenarios did not always accurately reflect the clinical situations they encountered in practice. This perception seems to be largely affected by the type of OSCE station. Students reported that using phantom heads and acrylic models to simulate the chairside clinical situation has many drawbacks, including a lack of clinical authenticity and the absence of communication skills testing. Simulated patients help to overcome these drawbacks. Nevertheless, simulated patient scenarios are usually limited to non-invasive procedures, which do not test manual dexterity and are not effective in assessing clinical operative skills (22).
Some students also believe that OSCE exams do not reflect their true ability because they are timed and do not always allow for the creativity and flexibility required in real-life clinical settings (23). Combining the OSCE with other assessment methods has been recommended to overcome these drawbacks and improve assessment validity and students' learning experience (16, 24).
OSCE's limitations
Despite the numerous advantages of OSCE exams, some drawbacks must be addressed. One of the most significant constraints is the expense and time required to set up an OSCE exam. Another limitation is the possibility of students memorizing the scenarios, which reduces their engagement with the exam. Furthermore, OSCE exams may not adequately capture the complex nature of clinical practice or assess students' ability to engage with real patients (25).
Simulated patient OSCE stations can be challenging because of several logistical issues. These stations require a great deal of planning and preparation: choosing and training the actor, setting up a large area with multiple rooms, calibrating the rating system, hiring support staff to operate the computer, and creating any additional materials that might be required (x-rays, periodontal charts, study models, etc.) (13).
Some educators have expressed concern that fragmenting complex clinical situations into short OSCE stations could lead to a loss of validity. To achieve thorough and efficient assessment, educators have investigated the benefit of combining the OSCE with other assessment techniques. Some authors suggested that combining the OSCE with methods such as the Direct Observation of Procedural Skills (DOPS) and the Mini-Clinical Evaluation Exercise (mini-CEX) can be an effective and feasible evaluation approach for clinical dentistry: DOPS assesses students' operative abilities, while the mini-CEX evaluates their diagnostic and treatment abilities in real-world clinical settings (24, 26).
Some quantitative studies show that examiners frequently differ in their OSCE scores. This poses a significant risk to the reliability of assessment results and can compromise the validity of the entire testing procedure (27, 28). To reduce this risk and ensure the validity of exam results, candidates should complete a sufficient number of stations, ideally with multiple examiners. Examiner differences can also be lessened by rigorous preparation and training, including exposure to a range of scoring patterns (29).
Students frequently complain about a lack of preparation for OSCE exams. While evidence on students' OSCE preparation resources is contradictory, alternative activities such as mock OSCEs with peer feedback and the use of online resources, including serious games, are growing. Serious games are non-recreational games that have been shown to be at least as effective as more traditional resources for enhancing knowledge, skills, and overall satisfaction. Because of their immersive experience and round-the-clock availability, serious games may be useful for OSCE preparation (30).
Virtual OSCE
The Remote Objective Structured Clinical Examination, or virtual OSCE, uses telecommunication technologies to conduct clinical assessments remotely. During the COVID-19 pandemic, the virtual OSCE became increasingly important for evaluating dental students' clinical competencies while adhering to social-distancing guidelines (31). The exam is conducted online via video-conferencing software, with students interacting with standardized patients in another location. The standardized patients may be actors or real patients trained to simulate clinical encounters, giving students the opportunity to demonstrate their knowledge, skills, and patient-interaction abilities in a realistic context. Defined procedures and assessment tools are used to ensure the validity and reliability of the remote OSCE. Digital tools such as high-resolution cameras and real-time video-conferencing software also contribute to a more realistic and immersive experience for both students and standardized patients (32–34).
Remote OSCEs can be very useful for assessing students' communication skills with patients, which is an important feature of dental practice. Furthermore, they can help to reduce the logistical challenges and costs associated with traditional in-person clinical exams, such as the need for physical space and standardized patients (35, 36).
The virtual OSCE has been well received by both students and examiners, who found it to be as engaging and interactive as in-person teaching. Current evidence shows moderate agreement between virtual OSCEs and on-site clinical assessments; however, this finding is based on studies with low methodological quality and small sample sizes. Additional research is therefore necessary to further investigate the efficacy and validity of the virtual OSCE in medical and dental education (37).
Peer OSCE
A peer OSCE is a technique in which medical students and healthcare workers are evaluated by their peers. It entails observing colleagues as they perform clinical tasks on standardized patients and giving them feedback. This method has several advantages, including the opportunity to receive constructive comments from others with similar knowledge and skill levels. Such feedback can help trainees improve their clinical abilities and think critically about how they provide patient care. Furthermore, the collaborative character of peer OSCEs can build a supportive learning atmosphere, promote teamwork, and boost learners' confidence (38).
The peer OSCE is also inexpensive, as it requires few resources and is simple to incorporate into existing courses. Overall, the benefits of peer OSCEs include constructive feedback, improved learning, teamwork, increased confidence, and cost-effectiveness. Because of these advantages, peer OSCEs have the potential to be a beneficial tool for measuring and enhancing clinical skills among dental students (39).
One study compared the perceived value of feedback on dental students' OSCE performance given by either a faculty member or a peer. Students perceived value in the feedback from both faculty and peers and believed it enhanced their skills; however, they rated faculty feedback significantly higher. Peer feedback on non-technical clinical competency assessments could be a valuable addition to the learning environment, though not a substitute for faculty feedback (39).
To obtain the desired benefits of peer feedback, students need substantial training in administering the OSCE, with emphasis on how to give constructive feedback in a positive environment. Feedback given in a negative manner may adversely affect a student's performance in subsequent stations (40).
Conclusion
In conclusion, numerous studies and reports have demonstrated that OSCEs are an effective technique for evaluating clinical competency in dental education. They offer a reliable and valid way of assessing students' clinical skills and identifying areas for improvement. The use of OSCE exams in dental schools ensures that students are evaluated in a standardized and objective manner and that they can practice their clinical skills in a safe and controlled environment. Although there are certain difficulties in using OSCE exams, the benefits greatly outweigh the disadvantages. To ensure the quality of dental education and foster the development of competent dental practitioners, dental schools should continue to incorporate OSCE assessments within their curricula. They should also use student input to improve the design and delivery of their OSCE exams, ensuring that these remain a useful tool for assessing students' clinical knowledge and skills.
Author contributions
MR: Conceptualization, Writing – original draft, Writing – review & editing.
Funding
The author declares that no financial support was received for the research, authorship, and/or publication of this article.
Conflict of interest
The author declares that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
Publisher's note
All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.
References
1. Harden RM, Gleeson FA. Assessment of clinical competence using an objective structured clinical examination (OSCE). Med Educ. (1979) 13(1):41–54. doi: 10.1111/j.1365-2923.1979.tb00918.x
2. Abd-Rahman ANA, Baharuddin IH, Abu-Hassan MI, Davies SJ. A comparison of different standard-setting methods for professional qualifying dental examination. J Dent Educ. (2021) 85(7):1210–6. doi: 10.1002/jdd.12600
3. Martin RD, Naziruddin Z. Systematic review of student anxiety and performance during objective structured clinical examinations. Curr Pharm Teach Learn. (2020) 12(12):1491–7. doi: 10.1016/j.cptl.2020.07.007
4. Mills EA. Non-patient-based clinical licensure examination for dentistry in Minnesota: significance of decision and description of process. J Dent Educ. (2016) 80(6):648–51. doi: 10.1002/j.0022-0337.2016.80.6.tb06125.x
5. Burkert V, Stoykova M, Semerdjieva M. Communication skills teaching methods in dental education—a review. Folia Med. (2021) 63(1):30–4. doi: 10.3897/folmed.63.e52343
6. Al Nazzawi AA. Dental students’ perception of the objective structured clinical examination (OSCE): the Taibah University experience, Almadinah Almunawwarah, KSA. J Taibah Univ Med Sci. (2017) 13(1):64–9. doi: 10.1016/j.jtumed.2017.09.002
7. Park SE, Anderson NK, Karimbux NY. OSCE and case presentations as active assessments of dental student performance. J Dent Educ. (2016) 80(3):334–8. doi: 10.1002/j.0022-0337.2016.80.3.tb06089.x
8. Puryer J, Neville P, Fowler E. Between fairness and fear-dental undergraduates’ attitudes towards objective structured clinical examinations. Eur J Dent Educ. (2019) 23(3):323–31. doi: 10.1111/eje.12433
9. Egloff-Juras C, Hirtz P, Luc A, Vaillant-Corroy AS. An objective structured clinical examination (OSCE) for French dental students: feedback after 2 years. Dent J. (2021) 9(11):136. doi: 10.3390/dj9110136
10. Cidoncha G, Muñoz-Corcuera M, Sánchez V, Pardo Monedero MJ, Antoranz A. The objective structured clinical examination (OSCE) in periodontology with simulated patient: the most realistic approach to clinical practice in dentistry. Int J Environ Res Public Health. (2023) 20(3):2661. doi: 10.3390/ijerph20032661
11. Aboalsaud KM, Nieto VK, Eagle IT, Rulli D. Dental hygiene educators’ knowledge and implementation of objective structured clinical examination in United States dental hygiene programs. J Dent Educ. (2023) 87(1):25–33. doi: 10.1002/jdd.13087
12. Nie R, Zhu F, Meng X, Zhang H, Xie S, Wu L, et al. Application of OSCE for stage assessment in standardized training for oral residents. J Dent Educ. (2018) 82(9):1000–6. doi: 10.21815/JDE.018.099
13. Manogue M, Brown G. Developing and implementing an OSCE in dentistry. Eur J Dent Educ. (1998) 2(2):51–7. doi: 10.1111/j.1600-0579.1998.tb00039.x
14. Fields HW, Rowland ML, Vig KW, Huja SS. Objective structured clinical examination use in advanced orthodontic dental education. Am J Orthod Dentofacial Orthop. (2007) 131(5):656–63. doi: 10.1016/j.ajodo.2007.01.013
15. Höfer SH, Schuebel F, Sader R, Landes C. Development and implementation of an objective structured clinical examination (OSCE) in CMF-surgery for dental students. J Craniomaxillofac Surg. (2013) 41(5):412–6. doi: 10.1016/j.jcms.2012.11.007
16. Moreno-López R, Sinclair S. Evaluation of a new e-learning resource for calibrating OSCE examiners on the use of rating scales. Eur J Dent Educ. (2020) 24(2):276–81. doi: 10.1111/eje.12495
17. Hsiang-Hua Lai E, Zwei-Chieng Chang J, Wang CY, Cheng YC, Tsai SL. Summative objective structured clinical examination as a reference of learners’ need before entering their internship. J Dent Sci. (2018) 13(4):350–3. doi: 10.1016/j.jds.2018.08.005
18. Wardman MJ, Yorke VC, Hallam JL. Evaluation of a multi-methods approach to the collection and dissemination of feedback on OSCE performance in dental education. Eur J Dent Educ. (2018) 22(2):e203–11. doi: 10.1111/eje.12273
19. Clarke A, Lai H, Sheppard A, Yoon MN. Development of diagnostic score reporting for a dental hygiene structured clinical assessment. Can J Dent Hyg. (2021) 55(1):48–56. PMID: 33643417; PMCID: PMC7906122
20. Graham R, Zubiaurre Bitzer LA, Mensah FM, Anderson OR. Dental student perceptions of the educational value of a comprehensive, multidisciplinary OSCE. J Dent Educ. (2014) 78(5):694–702. doi: 10.1002/j.0022-0337.2014.78.5.tb05721.x
21. Brand H, Schoonheim-Klein M. Is the OSCE more stressful? Examination anxiety and its consequences in different assessment methods in dental education. Eur J Dent Educ. (2009) 13:147–53. doi: 10.1111/j.1600-0579.2008.00554.x
22. Mossey PA, Newton JP, Stirrups DR. Scope of the OSCE in the assessment of clinical skills in dentistry. Br Dent J. (2001) 190(6):323–6. doi: 10.1038/sj.bdj.4800961
23. Puryer J. Dental undergraduate views of objective structured clinical examinations (OSCEs): a literature review. Dent J. (2016) 4(1):6. doi: 10.3390/dj4010006
24. Niu L, Mei Y, Xu X, Guo Y, Li Z, Dong S, et al. A novel strategy combining Mini-CEX and OSCE to assess standardized training of professional postgraduates in department of prosthodontics. BMC Med Educ. (2022) 22(1):888. doi: 10.1186/s12909-022-03956-w
25. Majumder MAA, Kumar A, Krishnamurthy K, Ojeh N, Adams OP, Sa B. An evaluative study of objective structured clinical examination (OSCE): students and examiners perspectives. Adv Med Educ Pract. (2019) 10:387–97. doi: 10.2147/AMEP.S197275
26. Hatala R, Marr S, Cuncic C, Bacchus CM. Modification of an OSCE format to enhance patient continuity in a high-stakes assessment of clinical performance. BMC Med Educ. (2011) 11:23. doi: 10.1186/1472-6920-11-23
27. Bartman I, Smee S, Roy M. A method for identifying extreme OSCE examiners. Clin Teach. (2013) 10(1):27–31. doi: 10.1111/j.1743-498X.2012.00607.x
28. Wood TJ, Chan J, Humphrey-Murto S, Pugh D, Touchie C. The influence of first impressions on subsequent ratings within an OSCE station. Adv Health Sci Educ Theory Pract. (2017) 22(4):969–83. doi: 10.1007/s10459-016-9736-z
29. Homer M. Towards a more nuanced conceptualisation of differential examiner stringency in OSCEs. Adv Health Sci Educ Theory Pract. (2023). doi: 10.1007/s10459-023-10289-w [Epub ahead of print]. PMID: 37843678
30. Germa A, Gosset M, Gaucher C, Valencien C, Schlumberger M, Colombier ML, et al. OSCEGame: a serious game for OSCE training. Eur J Dent Educ. (2021) 25(4):657–63. doi: 10.1111/eje.12643
31. Seifert LB, Coppola A, Diers JWA, Kohl C, Britz V, Sterz J, et al. Implementation and evaluation of a tele-OSCE in oral and maxillofacial surgery—a pilot report. GMS J Med Educ. (2022) 39(5):Doc50. doi: 10.3205/zma001571
32. Donn J, Scott JA, Binnie V, Bell A. A pilot of a virtual objective structured clinical examination in dental education. A response to COVID-19. Eur J Dent Educ. (2020) 25(3):488–94. doi: 10.1111/eje.12624
33. Sarmiento M, Corvus TS, Hunsinger M, Davis-Risen S, Chatnick PA, Bell K. Implementation of virtual interprofessional observed structured clinical encounters (OSCEs): a pilot study. J Allied Health. (2022) 51(4):e119–24. PMID: 36473227
34. Porto FR, Ribeiro MA, Ferreira LA, Oliveira RG, Devito KL. In-person and virtual assessment of oral radiology skills and competences by the objective structured clinical examination. J Dent Educ. (2023) 87(4):505–13. doi: 10.1002/jdd.13138
35. Hytönen H, Näpänkangas R, Karaharju-Suvanto T, Eväsoja T, Kallio A, Kokkari A, et al. Modification of national OSCE due to COVID-19—implementation and students’ feedback. Eur J Dent Educ. (2021) 25(4):679–88. doi: 10.1111/eje.12646
36. Donn J, Scott JA, Binnie V, Mather C, Beacher N, Bell A. Virtual objective structured clinical examination during the COVID-19 pandemic: an essential addition to dental assessment. Eur J Dent Educ. (2023) 27(1):46–55. doi: 10.1111/eje.12775
37. Kunutsor SK, Metcalf EP, Westacott R, Revell L, Blythe A. Are remote clinical assessments a feasible and acceptable method of assessment? A systematic review. Med Teach. (2022) 44(3):300–8. doi: 10.1080/0142159X.2021.1987403
38. McKay A, McCall J, Cairns AM. Peer assessment: development and delivery of the OSCE. Eur J Dent Educ. (2023) 27(2):234–9. doi: 10.1111/eje.12796
39. Andrews E, Dickter DN, Stielstra S, Pape G, Aston SJ. Comparison of dental students’ perceived value of faculty vs. peer feedback on non-technical clinical competency assessments. J Dent Educ. (2019) 83(5):536–45. doi: 10.21815/JDE.019.056
Keywords: OSCE, dental education, assessment, dental students, exam
Citation: Rayyan MR (2024) The use of objective structured clinical examination in dental education - a narrative review. Front. Oral Health 5:1336677. doi: 10.3389/froh.2024.1336677
Received: 11 November 2023; Accepted: 23 January 2024;
Published: 2 February 2024.
Edited by: Sobia Bilal, University of Illinois Chicago, United States
Reviewed by: Nagwa Nashat Hegazy, University of Menoufia, Egypt
© 2024 Rayyan. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
*Correspondence: Mohammad Ramadan Rayyan dr_rayyan@riyadh.edu.sa