
OPINION article
Front. Educ., 26 March 2025
Sec. Digital Education
Volume 10 - 2025 | https://doi.org/10.3389/feduc.2025.1570389
This article is part of the Research Topic "Digital Learning Innovations: Trends Emerging Scenario, Challenges and Opportunities".
As AI technologies continue to advance and influence healthcare, it is imperative that medical education evolves to equip future physicians with the necessary skills and an understanding of AI applications. In pre-clinical curricula, AI tools can streamline administrative processes, enhance teaching methodologies, and provide personalized learning experiences for students. Moreover, AI has the potential to shape students' professionalism, ethical decision-making, and critical thinking skills, which are essential for their future roles in the medical field. However, this shift also raises critical challenges, such as the need for ethical guidelines and adequate infrastructure, and the risk of over-reliance on AI. Recent studies have shown a significant gap in formal AI education within medical studies, even though students' general attitudes toward AI are positive (Bisdas et al., 2021). This paper is based on a comprehensive narrative review of the current literature and expert discussions. It aims to identify the key areas affected by AI integration in pre-clinical medical education and to provide recommendations for maximizing its benefits while mitigating the associated risks.
Universities face significant organizational challenges in integrating AI, such as ensuring ethical AI usage, addressing data privacy concerns, and meeting infrastructural requirements. Developing comprehensive policies to guide AI's role in academic settings is essential, including its application in exams, theses, teaching, and clinical decision-making tools (Bisdas et al., 2021; Karabacak et al., 2023). Policy development requires close cooperation between academic self-governance bodies, which safeguard academic freedom, and university management, which administers personnel and financial resources and enforces institutional rules (Abdul Kadir and Schütze, 2022).
One significant opportunity presented by AI integration is the optimization of administrative tasks. AI-driven data analysis can improve resource management, allowing universities to allocate resources more efficiently and effectively (Knopp et al., 2023; Peacock et al., 2023). Additionally, AI can enhance global collaboration in medical education by facilitating communication and information sharing across institutions worldwide. Implementing digital learning infrastructures, such as virtual simulation labs and AI-assisted learning platforms, has the potential to improve teaching efficiency and provide innovative educational experiences for students (Xu et al., 2024; Wu et al., 2024).
However, several challenges persist. Universities must invest in digital technologies while balancing traditional educational needs to prevent over-reliance on AI (Wu et al., 2024; Kasneci et al., 2023). Ensuring equal and fair access to AI tools for all students is crucial to avoid disparities in educational opportunities, and university-wide adjustments should not impair the freedom of teaching. Establishing ethical frameworks is imperative to promote the responsible use of AI, especially concerning patient data in AI models (Wu et al., 2024; Li et al., 2024; Borenstein and Howard, 2021; Gordon et al., 2024). Addressing these ethical considerations is essential for safeguarding data privacy and upholding academic integrity.
At the program management level, aligning the needs of students and educators regarding AI usage in education is crucial. This alignment impacts curriculum design, coordination among educators, and the preparation of students for future job roles in an AI-influenced healthcare environment (Civaner et al., 2022).
One of the primary challenges is ensuring that curricula remain relevant amid AI's growing impact on healthcare. This necessitates significant restructuring of courses to integrate AI tools while maintaining the essential human elements of medical education, such as patient interaction and ethical decision-making (Li et al., 2024; Chan and Hu, 2023; Khan et al., 2023; Rasouli et al., 2024). Program managers must balance the incorporation of new technologies with the preservation of core medical competencies to provide a comprehensive education under conditions in which these core competencies are continually reassessed.
Conversely, AI offers opportunities to develop innovative course structures that include experiential components. For instance, AI simulations for clinical decision-making can enhance learning outcomes by providing students with practical, hands-on experience in a controlled environment. Additionally, AI can help identify future job market needs, enabling study programs to adapt their curricula and train students in AI-driven medical technologies (Bisdas et al., 2021; Peacock et al., 2023; Wu et al., 2022). This proactive approach ensures that graduates are better prepared for the evolving demands of the healthcare sector.
For teaching staff, integrating AI into medical curricula demands the acquisition of new skills and methodologies, as well as an understanding of how to balance AI with traditional teaching methods. AI tools such as virtual tutors and simulation resources have the potential to greatly enhance the learning experience, but only when applied appropriately. Therefore, educators need to be trained adequately to use these tools effectively, and they need to be willing to receive that training (Peacock et al., 2023; Li et al., 2024).
Opportunities for educators include utilizing AI-enhanced teaching tools, including virtual simulations and real-time feedback systems, which provide more personalized and adaptive learning experiences for students (Chan and Hu, 2023; Zhang et al., 2024). AI can also streamline assessment and grading processes, allowing educators to devote more time to developing students' higher-order thinking skills (Knopp et al., 2023). By reducing administrative burdens, educators can focus on facilitating critical analysis, problem-solving abilities, and clinical reasoning.
Nevertheless, challenges persist. Training educators in AI usage is essential to ensure they can leverage these tools effectively and confidently integrate them into their teaching practices (Knopp et al., 2023; Wu et al., 2024). Additionally, ethical concerns regarding the use of AI in education, such as issues of academic integrity and potential biases in AI algorithms, need to be carefully managed (Alam et al., 2023). Educators must be equipped not only with technical skills but also with an understanding of the ethical implications of AI to guide students appropriately.
Students can benefit significantly from AI through personalized learning experiences and real-time feedback, which allow them to engage more deeply with educational content (Skryd and Lawrence, 2024). AI technologies can tailor educational materials to individual learning styles and pace, enhancing comprehension and retention of information.
However, the risk of over-reliance on AI technologies poses a significant challenge. There is a potential for undermining critical thinking and problem-solving skills if students become too dependent on AI for answers and guidance (Zhang et al., 2024). Developing digital literacy skills is essential for students to critically assess AI-generated information and to use these tools as aids rather than crutches.
Opportunities for students include accessing tailored learning resources based on their individual progress and needs (Kasneci et al., 2023). AI-driven tools can assist with time-consuming tasks such as literature reviews and data analysis, allowing students to focus on more complex aspects of their studies (Wu et al., 2024). This can enhance their learning efficiency and depth of understanding.
Challenges include managing academic integrity, particularly concerning AI-assisted assignments and exams (Chan and Hu, 2023). Students must be educated about the ethical use of AI and the importance of producing original work. Additionally, AI technologies can both bridge and widen gaps in education, as the availability, functionality, and applicability of tools may vary with university provisions and students' individual social backgrounds. Ensuring equitable access and fostering an inclusive learning environment are essential to prevent exacerbating existing inequalities.
University administration plays a pivotal role in the effective integration of AI into medical education. Firstly, establishing clear ethical guidelines on AI usage in exams and research is essential, with a focus on ethical concerns such as data privacy, copyright, and academic integrity, particularly in the clinical context (Wu et al., 2024; Li et al., 2024; Borenstein and Howard, 2021). Promoting inclusivity is also crucial; universities should ensure equal access to AI technologies for all students by providing necessary resources and support, thereby preventing disparities in educational opportunities. Investment in robust digital infrastructure is required, including AI-based tools for learning management systems and virtual simulation environments, to support both students and faculty effectively (Wu et al., 2024; Li et al., 2024). Implementing training programs for teaching staff on the integration of AI into curricula will ensure educators are equipped to use AI tools effectively (Knopp et al., 2023; Li et al., 2024). Moreover, developing frameworks for regulation and oversight is important to monitor AI's impact on student performance and academic integrity, ensuring fair and equitable application across all areas (Vilalta-Perdomo et al., 2023).
Study program management must adapt curricula to remain relevant in the face of AI's growing impact on healthcare. This involves restructuring existing curricula by integrating AI-based tools into relevant courses while maintaining critical aspects of patient interaction and human contact (Peacock et al., 2023; Xu et al., 2024). Developing specific AI-focused courses is recommended to introduce students to basic AI concepts, current AI technologies in healthcare, and future applications of AI in medicine (Knopp et al., 2023; Wu et al., 2024). Encouraging interdisciplinary integration through collaboration across departments can offer students opportunities to work together on projects that solve healthcare problems using AI (Knopp et al., 2023). Ensuring alignment of taught skills and competencies with evolving job market needs will help future physicians be prepared for AI-integrated healthcare settings (Civaner et al., 2022). Additionally, implementing appropriate assessment methods within AI teaching environments is necessary to accurately determine students' performance and scientific work, providing motivation for further development (Boscardin et al., 2024).
Educators should embrace didactic innovation by incorporating AI tools to complement traditional teaching methods. This includes utilizing virtual tutors and AI-driven feedback systems while maintaining face-to-face interaction for the development of complex skills (Wu et al., 2024; Kasneci et al., 2023; Zhang et al., 2024). Aligning curriculum development with AI-driven learning by integrating experiential, problem-solving, and simulation components can enhance students' critical thinking abilities (Xu et al., 2024; Li et al., 2024; Chan and Hu, 2023). Using AI tools to streamline grading and offer personalized feedback is beneficial, but educators must ensure proper oversight to maintain academic integrity (Bisdas et al., 2021; Vilalta-Perdomo et al., 2023; Tangadulrat et al., 2023). Furthermore, identifying and demonstrating complementary skills to AI functionalities in practical settings can provide students with context and relevance to their theoretical knowledge (Katsamakas et al., 2024).
Students should be trained in AI literacy, emphasizing the responsible use of AI tools and the necessity of critical thinking and scientific rigor when interpreting AI-generated results (Borenstein and Howard, 2021; Weidener and Fischer, 2023; Katyal et al., 2024; Jebreen et al., 2024; Nguyen, 2024). Encouraging students to use AI tools to complement, rather than replace, traditional study methods can promote independent learning and problem-solving, as well as accelerate time-consuming tasks like information extraction and summarizing (Xu et al., 2024; Li et al., 2024). Equipping students with the skills to critically evaluate AI-driven diagnostic tools and integrate AI technologies into daily clinical practice is essential for their future roles in healthcare (Tangadulrat et al., 2023).
The integration of AI into medical education represents a paradigm shift that offers both transformative opportunities and significant challenges. Universities must carefully navigate ethical considerations, infrastructural demands, and pedagogical concerns to maximize AI's benefits while mitigating associated risks. AI tools have the potential not only to enhance learning but also to influence the development of students' professionalism and ethical decision-making skills, which are essential for their future roles as physicians.
In the context of increasing competition among universities, the integration of AI into administrative, teaching, and research activities becomes a strategic imperative. Implementing AI technologies can differentiate institutions, create added value, and provide relevant educational pathways that align with desired job profiles in the healthcare market. Recognizing the opportunities presented by AI allows higher education institutions to identify complementary skills that should be incorporated into curricula (Katsamakas et al., 2024). To overcome potential resistance to change within institutions, it is crucial to communicate clearly about the opportunities and risks of AI integration and to involve stakeholders in the planning process.
This paper has outlined a structured approach to AI integration, emphasizing the importance of ethical oversight, balanced curriculum development, and adequate training for both educators and students. By addressing the identified challenges and implementing the proposed recommendations, universities can better prepare future physicians to operate in an AI-enhanced healthcare environment, ensuring that medical education evolves in tandem with technological advancements.
To fully harness AI's potential in medical education—particularly in the subsequent clinical training—future considerations must include collaboration with clinical partners. This collaboration is essential to ensure responsible AI integration, with a strong emphasis on the 'human factor' that AI cannot replicate. Engaging clinical partners will help bridge the gap between pre-clinical education and clinical practice, fostering a holistic approach to medical training that combines technological proficiency with essential human skills.
The integration of artificial intelligence into pre-clinical medical education presents both promising opportunities and notable challenges. By addressing ethical considerations, investing in infrastructure, and providing adequate training, universities can effectively incorporate AI into their curricula. Emphasizing the balance between technological innovation and the indispensable human aspects of medical practice will equip future physicians with the competencies required in an AI-influenced healthcare system.
BP: Conceptualization, Writing – original draft, Writing – review & editing. LM: Conceptualization, Writing – original draft, Writing – review & editing. SF: Writing – original draft, Writing – review & editing. K-EC: Writing – review & editing. RB: Conceptualization, Writing – review & editing. SH: Conceptualization, Writing – review & editing.
The author(s) declare that no financial support was received for the research and/or publication of this article.
The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
The author(s) declare that Gen AI was used in the creation of this manuscript. Generative AI was used to shorten, rephrase and correct the original script.
All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.
Abdul Kadir, N., and Schütze, H. (2022). Medical educators' perspectives on the barriers and enablers of teaching public health in the undergraduate medical schools: a systematic review. Glob. Health Action 15:2106052. doi: 10.1080/16549716.2022.2106052
Alam, F., Lim, M. A., and Zulkipli, I. N. (2023). Integrating AI in medical education: embracing ethical usage and critical understanding. Front. Med. 10:1279707. doi: 10.3389/fmed.2023.1279707
Bisdas, S., Topriceanu, C. C., Zakrzewska, Z., Irimia, A. V., Shakallis, L., Subhash, J., et al. (2021). Artificial intelligence in medicine: a multinational multi-center survey on the medical and dental students' perception. Front. Public Health 9:795284. doi: 10.3389/fpubh.2021.795284
Borenstein, J., and Howard, A. (2021). Emerging challenges in AI and the need for AI ethics education. AI Ethics 1, 61–65. doi: 10.1007/s43681-020-00002-7
Boscardin, C. K., Gin, B., Golde, P. B., and Hauer, K. E. (2024). ChatGPT and generative artificial intelligence for medical education: potential impact and opportunity. Acad. Med. 99:22. doi: 10.1097/ACM.0000000000005439
Chan, C. K. Y., and Hu, W. (2023). Students' voices on generative AI: perceptions, benefits, and challenges in higher education. Int. J. Educ. Technol. High Educ. 20:43. doi: 10.1186/s41239-023-00411-8
Civaner, M. M., Uncu, Y., Bulut, F., Chalil, E. G., and Tatli, A. (2022). Artificial intelligence in medical education: a cross-sectional needs assessment. BMC Med. Educ. 22:772. doi: 10.1186/s12909-022-03852-3
Gordon, M., Daniel, M., Ajiboye, A., Uraiby, H., Xu, N. Y., Bartlett, R., et al. (2024). A scoping review of artificial intelligence in medical education: BEME Guide No. 84. Med. Teach. 46, 446–470. doi: 10.1080/0142159X.2024.2314198
Jebreen, K., Radwan, E., Kammoun-Rebai, W., Alattar, E., Radwan, A., Safi, W., et al. (2024). Perceptions of undergraduate medical students on artificial intelligence in medicine: mixed-methods survey study from Palestine. BMC Med. Educ. 24:507. doi: 10.1186/s12909-024-05465-4
Karabacak, M., Ozkara, B. B., Margetis, K., Wintermark, M., and Bisdas, S. (2023). The advent of generative language models in medical education. JMIR Med. Educ. 9:e48163. doi: 10.2196/48163
Kasneci, E., Sessler, K., Küchemann, S., Bannert, M., Dementieva, D., Fischer, F., et al. (2023). ChatGPT for good? On opportunities and challenges of large language models for education. Learn. Individ. Differ. 103:102274. doi: 10.1016/j.lindif.2023.102274
Katsamakas, E., Pavlov, O. V., and Saklad, R. (2024). Artificial intelligence and the transformation of higher education institutions. Sustainability 16:6118. doi: 10.3390/su16146118
Katyal, A., Chowdhury, S., Sharma, P. K., and Kannan, M. (2024). Fink's integrated course design and taxonomy: the impact of their use in an undergraduate introductory course on bioinformatics. J. Sci. Educ. Technol. 33, 493–504. doi: 10.1007/s10956-024-10100-4
Khan, R. A., Jawaid, M., Khan, A. R., and Sajjad, M. (2023). ChatGPT—Reshaping medical education and clinical management. Pak. J. Med. Sci. 39, 605–607. doi: 10.12669/pjms.39.2.7653
Knopp, M. I., Warm, E. J., Weber, D., Kelleher, M., Kinnear, B., Schumacher, D. J., et al. (2023). AI-enabled medical education: threads of change, promising futures, and risky realities across four potential future worlds. JMIR Med. Educ. 9:e50373. doi: 10.2196/50373
Li, Z., Li, F., Fu, Q., Wang, X., Liu, H., Zhao, Y., et al. (2024). Large language models and medical education: a paradigm shift in educator roles. Smart Learn. Environ. 11:26. doi: 10.1186/s40561-024-00313-w
Nguyen, T. (2024). ChatGPT in medical education: a precursor for automation bias? JMIR Med. Educ. 10:e50174. doi: 10.2196/50174
Peacock, J., Austin, A., Shapiro, M., Battista, A., and Samuel, A. (2023). Accelerating medical education with ChatGPT: an implementation guide. MedEdPublish 13:64. doi: 10.12688/mep.19732.2
Rasouli, S., Alkurdi, D., and Jia, B. (2024). The role of artificial intelligence in modern medical education and practice: a systematic literature review. medRxiv. doi: 10.1101/2024.07.25.24311022
Skryd, A., and Lawrence, K. (2024). ChatGPT as a tool for medical education and clinical decision-making on the wards: case study. JMIR Form Res. 8:e51346. doi: 10.2196/51346
Tangadulrat, P., Sono, S., and Tangtrakulwanich, B. (2023). Using ChatGPT for clinical practice and medical education: cross-sectional survey of medical students' and physicians' perceptions. JMIR Med. Educ. 9:e50658. doi: 10.2196/50658
Vilalta-Perdomo, E., Salinas-Navarro, D. E., Thierry-Aguilera, R., and Gerardou, F. S. (2023). Challenges and opportunities of generative AI for higher education as explained by ChatGPT. Educ. Sci. 13:856. doi: 10.3390/educsci13090856
Weidener, L., and Fischer, M. (2023). Teaching AI ethics in medical education: a scoping review of current literature and practices. Perspect. Med. Educ. 12, 399–410. doi: 10.5334/pme.954
Wu, Q., Wang, Y., Lu, L., Chen, Y., Long, H., Wang, J., et al. (2022). Virtual simulation in undergraduate medical education: a scoping review of recent practice. Front. Med. 9:855403. doi: 10.3389/fmed.2022.855403
Wu, Y., Zheng, Y., Feng, B., Yang, Y., Kang, K., Zhao, A., et al. (2024). Embracing ChatGPT for medical education: exploring its impact on doctors and medical students. JMIR Med. Educ. 10:e52483. doi: 10.2196/52483
Xu, X., Chen, Y., and Miao, J. (2024). Opportunities, challenges, and future directions of large language models, including ChatGPT in medical education: a systematic scoping review. J. Educ. Eval. Health Prof. 21:6. doi: 10.3352/jeehp.2024.21.6
Keywords: artificial intelligence, teaching, learning, ChatGPT, medicine, medical education
Citation: Pohn B, Mehnen L, Fitzek S, Choi K-E, Braun RJ and Hatamikia S (2025) Integrating artificial intelligence into pre-clinical medical education: challenges, opportunities, and recommendations. Front. Educ. 10:1570389. doi: 10.3389/feduc.2025.1570389
Received: 03 February 2025; Accepted: 13 March 2025;
Published: 26 March 2025.
Edited by:
Amna Mirza, University of Delhi, India
Reviewed by:
Syed Nadeem Fatmi, Jamia Millia Islamia, India
Copyright © 2025 Pohn, Mehnen, Fitzek, Choi, Braun and Hatamikia. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
*Correspondence: Birgit Pohn, birgit.pohn@technikum-wien.at
†These authors have contributed equally to this work