
OPINION article

Front. Med., 03 July 2024
Sec. Regulatory Science
This article is part of the Research Topic Regulatory Science and Meta Science as Components of Regulation of Medical Products and Practices.

Enhancing the quality and efficiency of regulatory science literature reviews through innovation and collaboration with library and information science experts

  • 1Department of Population Health, New York University Grossman School of Medicine, New York, NY, United States
  • 2Health Sciences Library, New York University Grossman School of Medicine, New York, NY, United States

Introduction

The importance of literature reviews as part of regulatory science is widely understood. From large trials to follow-up case studies, capturing existing research is key to creating safe and effective regulatory policies. With a shift toward incorporating additional evidence types—beyond randomized controlled trials (RCTs)—into regulatory decisions and the potential for literature reviews themselves to generate new data (1–7), the importance of high-quality literature reviews is increasing. However, rigorous and comprehensive literature reviews—that ensure important data are not missed (8)—take substantial effort and can be difficult to do well. To encourage both comprehensiveness and efficiency in regulatory science literature reviews, additional focus is needed on integrating rigorous information science methodologies into regulatory science search strategies and on developing innovative tools and strategies that can augment traditional search methods.

Meta-research (research on research) provides a way to examine the efficiency, quality, and potential bias in the overall research ecosystem (9). By applying a meta-research lens to literature reviews in regulatory science, we seek to elaborate on the challenges of performing high-quality data retrieval for regulatory science literature reviews and to identify strategies that may improve literature review quality and efficiency. Specifically, we hope to highlight the value of engaging library and information science expertise in the data retrieval process, including the strategic development of new controlled vocabulary and the enhancement of conventional search techniques with novel digital technologies.

Challenges facing literature review in regulatory science

Due to the scope of potentially relevant materials, regulatory expectations, and an increasingly large body of research (10), regulatory science researchers face particular challenges in successfully identifying and retrieving the data they need. Identifying the potential obstacles to effective, high-quality literature review, both those unique to regulatory research and those shared with other disciplines, is important for determining potential solutions to these barriers. The challenges identified fall into two main categories: the diversity of potential data sources and the breadth of potential search terms (11). These challenges are overlaid by the various stipulations put forth by regulatory bodies, such as post-marketing safety reports (12), contributing to the importance of rapid turnaround and the need for repeated searches (13).

Data relevant to a specific regulatory science research question may be found throughout a multitude of databases and repositories that house peer-reviewed research studies and/or gray literature (14). Some databases have a broad topical and geographic focus, whereas others may be more specialized or include language-specific resources. While there is often significant overlap among database contents, some materials will only be found in select databases; this necessitates a detailed knowledge of database characteristics to identify relevant search locations, as well as a search strategy that spans multiple databases (15).
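The overlap among database contents described above means that records retrieved from multiple databases must be merged and deduplicated before screening. The following sketch shows one common approach, matching first on DOI and then on a normalized title; the example records and field names are invented for illustration, and real reference managers handle many more edge cases.

```python
# Hypothetical sketch: merging search results from several databases and
# removing duplicates, first by DOI, then by normalized title.
# Record dictionaries and their keys ("doi", "title") are illustrative.

def normalize_title(title: str) -> str:
    """Lowercase and strip punctuation/whitespace so near-identical titles match."""
    return "".join(ch for ch in title.lower() if ch.isalnum())

def deduplicate(records):
    seen_dois, seen_titles, unique = set(), set(), []
    for rec in records:
        doi = (rec.get("doi") or "").lower()
        title_key = normalize_title(rec.get("title", ""))
        if doi and doi in seen_dois:
            continue  # same study already retrieved from another database
        if not doi and title_key in seen_titles:
            continue  # no DOI indexed; fall back to title matching
        if doi:
            seen_dois.add(doi)
        seen_titles.add(title_key)
        unique.append(rec)
    return unique

pubmed = [{"doi": "10.1/abc", "title": "Drug X safety"}]
embase = [{"doi": "10.1/ABC", "title": "Drug X Safety"},
          {"doi": "", "title": "Drug X safety"}]  # same study, no DOI indexed
print(len(deduplicate(pubmed + embase)))  # 1 unique record
```

Matching on normalized titles is deliberately conservative here; production tools also compare authors, years, and page ranges before declaring two records identical.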

Gray literature can require an additional level of awareness and expertise to identify, as these non-peer-reviewed data sources are often not indexed in the same manner as peer-reviewed literature. Conference proceedings, institutional reports, interim results, and other types of gray literature may be the first data sources to publish important adverse reaction outcomes and therefore may be necessary to identify. These materials, however, are found in a diverse set of platforms that are not always well-known or easily searchable (16). Similarly, relevant data may be found as secondary analyses within published clinical studies. Thus, additional expertise may be required to search beyond a title or abstract to identify potentially relevant literature. Furthermore, not all published materials are of equivalent quality, and in the age of paper mills, predatory journals (17), fraudulent citations (18), and paper retractions (19), the expertise to discern which sources are reliable is needed to produce a high-quality regulatory literature review (20, 21).

In addition to the diverse locations in which it can be found, identifying relevant regulatory data can require an extensive list of search terms, as there can be inconsistency in how similar topics are described across geographies, disciplines, and time. To capture literature relevant for answering a research question, identifying an appropriate set of search terms is a critical step (22, 23). Additionally, databases often employ a controlled vocabulary for indexing articles (such as PubMed's Medical Subject Headings, MeSH). A search strategy encompassing multiple databases will often require translating the search into the controlled vocabulary of each database (24). Defining search terms can also pose a particular challenge when a topic has differing terminology across disciplines, terminology that has changed over time, or a research field whose limited maturity has left key terms less well-characterized (25, 26).
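The translation step described above can be sketched in code. The snippet below renders one concept as both a controlled-vocabulary term and a set of free-text synonyms, in PubMed and Embase syntax; the field tags ([Mesh], [tiab], /exp, :ti,ab) follow those databases' conventions, but the specific terms and synonym list are illustrative only, and any real strategy should be built with a librarian.

```python
# Illustrative sketch (not a validated search strategy): combining a controlled
# vocabulary term with free-text synonyms, then rendering the same concept in
# two database syntaxes. Terms and synonyms are examples, not a vetted set.

concept = {
    "mesh": "Drug-Related Side Effects and Adverse Reactions",  # MeSH heading
    "emtree": "adverse drug reaction",                          # Emtree term
    "synonyms": ["adverse drug reaction*", "adverse event*",
                 "drug toxicity", "side effect*"],  # variant and older terms
}

def pubmed_query(c):
    # [Mesh] searches the controlled vocabulary; [tiab] searches title/abstract.
    free_text = " OR ".join(f'"{s}"[tiab]' for s in c["synonyms"])
    return f'("{c["mesh"]}"[Mesh] OR {free_text})'

def embase_query(c):
    # /exp explodes the Emtree term; :ti,ab searches title/abstract.
    free_text = " OR ".join(f"'{s}':ti,ab" for s in c["synonyms"])
    return f"('{c['emtree']}'/exp OR {free_text})"

print(pubmed_query(concept))
print(embase_query(concept))
```

Pairing the controlled-vocabulary term with free-text synonyms is what keeps sensitivity high for articles not yet indexed or indexed under older headings.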

Regulatory bodies often have requirements for the collection of data published in other countries, on all drug or device formulations, and across the development spectrum. This can contribute to the need to identify appropriate vocabulary that accounts for foreign languages, spans all stages of development from pre-clinical to clinical studies, includes synonyms and older terms that have been replaced by more current terms, and captures all ways in which related products may be labeled. Furthermore, as the field of regulatory science has sought to include non-RCT evidence (1–7), additional vocabulary is needed to identify alternate study designs and data sources.

A role for library and information science experts

While in the age of rapid online literature searching the involvement of a librarian or other information science expert in literature reviews may feel unnecessary, these experts have become even more vital to the production of high-quality, thorough, and expedient regulatory science literature reviews. Finding all of the existing relevant research is a key component of performing effective regulatory research. However, with a multitude of databases and variations in search terms, understanding how and where to identify relevant data is not a simple task and requires strategy and knowledge of data systems. Therefore, studies that include team members with library and information science expertise are at a significant advantage for producing high-quality literature reviews with less overall effort (27–29).

Indeed, a whole field of library and information science has evolved to develop methods for the management and discoverability of research data for research synthesis, whether in the published literature or in data repositories (30). Those skilled in information science can provide invaluable insights into the best approaches for identifying potentially relevant literature (27–29). Information science experts can advise on how controlled vocabulary can be used or extended for a specific research topic, as well as generate high-quality search strategies.

Potential solutions to reduce the burden of information retrieval and maximize the quality of regulatory science literature reviews may be found in the expanded use of existing controlled vocabulary and the strategic development of new controlled vocabulary. Controlled vocabulary, or indexing terms, can facilitate and improve the accuracy of literature searches (31, 32) and is a powerful tool for overcoming some of the limitations of a basic keyword search. Without controlled vocabulary, search results are limited to the specific keywords or phrases used in the text, which can lead to irrelevant or incomplete results due to inconsistencies in terminology or alternate spellings. The standardization provided by controlled vocabulary thus helps ensure research is discoverable and accessible to a wider audience, and knowledge of the controlled vocabulary for each database is a valuable skillset that can improve the comprehensiveness of a literature review. However, controlled vocabulary has its limitations, and not all important areas of research have an existing controlled vocabulary available (33, 34). Furthermore, gray literature sources are typically not indexed with controlled vocabulary. Librarians and those with information science expertise can assist in designing search strategies that overcome the shortcomings of controlled vocabulary, as well as help advance the strategic development of new controlled vocabulary specific to regulatory research (35, 36).

A role for technological innovation

Even with a meticulously designed search strategy, the sheer amount of research findings in existence creates an important challenge to successfully and efficiently identifying all relevant research. Technological innovations, such as natural language processing (NLP) and generative artificial intelligence (GenAI), have the potential to provide alternative or supplementary approaches for identifying relevant research studies to address gaps and inefficiencies in information retrieval. Automation may help address the high volume of items in need of screening due to search term limitations by increasing search specificity and reducing the workload burden associated with manually screening all potentially appropriate items identified during literature search (37). However, there is significant work to be done before automation tools are viewed as successfully replacing traditional search methods (38, 39).

During the literature search process, a balance must be struck between achieving the sensitivity necessary to identify all relevant documents and maintaining the specificity needed to reduce the amount of irrelevant data collected. Classic systematic search strategies aimed at maximizing sensitivity can return a large number of irrelevant documents that must be screened manually. Despite the importance of comprehensive information retrieval for high-quality literature reviews, much of the effort to automate regulatory science data retrieval has focused on enhancing the efficiency of article screening and data extraction (40), rather than improving the comprehensiveness and quality of the data collected in the literature search itself. Efforts to automate the review need to not just decrease workload but also improve the quality of the data being identified (41).
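The sensitivity/specificity balance described above is typically quantified by comparing a strategy's retrieved set against a hand-built gold-standard set of known relevant records. A minimal sketch, with invented record IDs:

```python
# Minimal sketch: scoring a search strategy against a gold-standard set of
# known relevant records. Record IDs ("r1", "r2", ...) are invented.

def evaluate_search(retrieved: set, relevant: set):
    true_pos = retrieved & relevant
    sensitivity = len(true_pos) / len(relevant)              # recall
    precision = len(true_pos) / len(retrieved) if retrieved else 0.0
    return sensitivity, precision

retrieved = {"r1", "r2", "r3", "r4", "r5", "r6"}   # what the query returned
relevant = {"r1", "r2", "r7"}                      # known relevant studies
sens, prec = evaluate_search(retrieved, relevant)
print(f"sensitivity={sens:.2f} precision={prec:.2f}")
```

In this toy case the strategy finds two of three relevant records (high screening burden, one study missed); a systematic search would be tuned to push sensitivity toward 1.0 even at the cost of precision.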

For topics and data sources that do not have an extensive controlled vocabulary, NLP tools can potentially assist researchers by providing an alternate method for identifying relevant literature. NLP combines computational linguistics, machine learning, and deep learning models so that a computer program can process human language as it is spoken and written, rather than relying on predefined vocabulary or the presence of specific phrases or terms within the text. This type of automation can equip a researcher to increase the specificity of their search strategy while maintaining sensitivity, as NLP can capture a broader range of vocabulary combinations while using phrase interpretation to eliminate data sources that use similar vocabulary in irrelevant contexts.
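A toy illustration of the idea: instead of matching exact keywords, rank candidate abstracts by similarity of their overall vocabulary to a topic description. The hand-rolled TF-IDF weighting below is far simpler than the models real screening tools use, and the abstracts are invented, but it shows why a similarity-based approach can surface relevant records that a rigid phrase match would miss.

```python
# Toy sketch of NLP-assisted screening: rank candidate abstracts by TF-IDF
# cosine similarity to a topic description. Abstracts are invented; real
# tools use trained language models rather than this bag-of-words weighting.
import math
from collections import Counter

def tfidf_vectors(docs):
    tokenized = [doc.lower().split() for doc in docs]
    n = len(docs)
    df = Counter(term for toks in tokenized for term in set(toks))
    # term weight = term frequency * smoothed inverse document frequency
    return [{t: tf[t] * (math.log(n / df[t]) + 1.0) for t in tf}
            for tf in map(Counter, tokenized)]

def cosine(a, b):
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0

topic = "postmarketing adverse drug reaction surveillance"
abstracts = [
    "we report postmarketing surveillance of adverse drug reaction signals",
    "a randomized trial of drug efficacy in hypertension",
    "library workflows for cataloging conference proceedings",
]
vecs = tfidf_vectors([topic] + abstracts)
ranked = sorted(range(len(abstracts)),
                key=lambda i: cosine(vecs[0], vecs[i + 1]), reverse=True)
print(ranked)  # indices of abstracts, most topic-similar first
```

The first abstract shares most of the topic's vocabulary and ranks first even though it never contains the topic string verbatim; the unrelated library abstract ranks last.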

During the regulatory research literature review process, additional sources of information may be gleaned from within the reference lists of identified documents (42). But identifying and evaluating these citations can be time-consuming, and existing technology approaches have demonstrated suboptimal performance (38, 43, 44). GenAI tools can be developed to scan research documents and extract potentially relevant citations from a document's reference section for further review (43). Furthermore, the summarization capabilities of GenAI can reduce the burden of the screening process. Relevant data in a document may be contained within secondary analyses or sub-studies that are not reported in the document title or abstract, making the identification of these data sources time-consuming, as they require a full-text review. GenAI tools can be developed to analyze the full text and provide a summary of the data to minimize the effort required to screen these sources. While some screening and summarization tools are available, many have not been validated for the literature review data extraction and screening process (39, 45–49). Collaboration with an information science expert may improve the process of developing tools with improved validity.
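A deliberately simple, non-AI illustration of the citation-extraction step: pulling DOIs out of a reference list with a regular expression so the cited works can be queued for screening. The reference text below is invented; GenAI-based tools are aimed at far messier inputs, such as citations with no DOI at all.

```python
# Non-AI sketch of citation extraction: find DOIs in a reference section with
# a regular expression. The two references here are fabricated examples.
import re

# A DOI starts with "10.", a 4-9 digit registrant code, "/", and a suffix.
DOI_PATTERN = re.compile(r"10\.\d{4,9}/[^\s;,]+")

reference_section = """
1. Smith J. Example study. J Example. (2020) 1:1-9. doi: 10.1000/xyz123
2. Lee K. Another study. Available online at: https://doi.org/10.5195/jmla.2018.283
"""

dois = DOI_PATTERN.findall(reference_section)
print(dois)  # ['10.1000/xyz123', '10.5195/jmla.2018.283']
```

Each extracted DOI could then be resolved to a record and screened like any other search result; the hard cases that motivate GenAI tooling are references given only as author, title, and year.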

To truly make these technologies work for regulatory science research, however, concerted investment is needed to develop and adapt these tools to meet the needs of the regulatory science researcher and to meet systematic review standards (39). Collaboration between regulatory scientists and information science experts can facilitate the development of effective and relevant review tools. To ensure reliability, GenAI and NLP approaches need to be validated on search use cases. However, the creation of datasets for training and validation can be time intensive and requires existing literature search expertise (50). In addition to collaborating on tool development, librarians and information science experts can be an important resource for evaluating the validity and utility of tools being developed within industry. Furthermore, attention must be given to the current limitations of even "well-trained" GenAI. While adept at information generation and summarization, GenAI cannot discern the quality of the data sources it pulls from (51). Therefore, additional training will likely be necessary for a GenAI model to distinguish high- and low-quality data sources from an information science perspective (e.g., predatory journals) as well as to discern what constitutes a high-quality data source per the standards set forth by regulatory bodies.

Conclusion

Finding all existing relevant research is a key component of performing effective regulatory science research. However, regulatory science researchers face particular challenges in identifying and retrieving relevant data within an increasingly large body of research findings. With the high burden of sifting through irrelevant sources and a growing number of resources that may not be identified using traditional or less sophisticated search strategies, performing high-quality and efficient regulatory science reviews is a challenge. To address these barriers, further efforts are needed to integrate rigorous information science expertise into regulatory science search strategies and to develop innovative tools and strategies that can augment traditional search methods.

Author contributions

ES: Conceptualization, Methodology, Writing – original draft. GL: Conceptualization, Methodology, Writing – review & editing.

Funding

The author(s) declare financial support was received for the research, authorship, and/or publication of this article. This work was funded by NIA 1K01AG075169-01A1 (ES). The study sponsor was not involved in the conception or preparation of this manuscript.

Conflict of interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Publisher's note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

References

1. Barbui C, Addis A, Amato L, Traversa G, Garattini S. Can systematic reviews contribute to regulatory decisions? Eur J Clin Pharmacol. (2017) 73:507–9. doi: 10.1007/s00228-016-2194-y

2. U.S. Food and Drug Administration. 21st Century Cures Act. (2020). Available online at: https://www.fda.gov/regulatory-information/selected-amendments-fdc-act/21st-century-cures-act (accessed June 17, 2024).

3. European Food Safety Authority (EFSA). Dealing With Evidence. (2024). Available online at: https://www.efsa.europa.eu/en/methodology/evidence (accessed June 17, 2024).

4. European Medicines Agency. Regulatory Science Research Needs 1.0. (2022). Available online at: https://www.ema.europa.eu/en/documents/other/regulatory-science-research-needs_en.pdf (accessed June 17, 2024).

5. Burns L, Roux NL, Kalesnik-Orszulak R, Christian J, Hukkelhoven M, Rockhold F, et al. Real-world evidence for regulatory decision-making: guidance from around the world. Clin Ther. (2022) 44:420–37. doi: 10.1016/j.clinthera.2022.01.012

6. Beaulieu-Jones BK, Finlayson SG, Yuan W, Altman RB, Kohane IS, Prasad V, et al. Examining the use of real-world evidence in the regulatory process. Clin Pharmacol Ther. (2020) 107:843–52. doi: 10.1002/cpt.1658

7. Barbui C, Patten SB. Safety of psychotropic medicines: looking beyond randomised evidence. Epidemiol Psychiatr Sci. (2018) 27:529–30. doi: 10.1017/S2045796018000306

8. Ewald H, Klerings I, Wagner G, Heise TL, Stratil JM, Lhachimi SK, et al. Searching two or more databases decreased the risk of missing relevant studies: a metaresearch study. J Clin Epidemiol. (2022) 149:154–64. doi: 10.1016/j.jclinepi.2022.05.022

9. Harpe SE. Meta-research in pharmacy: time for a look in the mirror. Res Social Adm Pharm. (2021) 17:2028–35. doi: 10.1016/j.sapharm.2021.04.006

10. National Science Board, National Science Foundation. Publications Output: U.S. Trends and International Comparisons. Science and Engineering Indicators (2024). NSB-2023-33; Alexandria, VA. Available online at: https://ncses.nsf.gov/pubs/nsb202333/ (accessed June 17, 2024).

11. MacFarlane A, Russell-Rose T, Shokraneh F. Search strategy formulation for systematic reviews: issues, challenges and opportunities. Intell Syst Appl. (2022) 15:200091. doi: 10.1016/j.iswa.2022.200091

12. U.S. Food and Drug Administration. Postmarketing Safety Reporting for Human Drug and Biological Products Including Vaccines. (2020). Available online at: https://www.fda.gov/regulatory-information/search-fda-guidance-documents/postmarketing-safety-reporting-human-drug-and-biological-products-including-vaccines (accessed June 17, 2024).

13. Honig P, Zhang L. Regulation and innovation: role of regulatory science in facilitating pharmaceutical innovation. Clin Pharmacol Ther. (2019) 105:778–81. doi: 10.1002/cpt.1367

14. Gusenbauer M. The age of abundant scholarly information and its synthesis – a time when 'just google it' is no longer enough. Res Synth Methods. (2021) 12:684–91. doi: 10.1002/jrsm.1520

15. Wanyama SB, McQuaid RW, Kittler M. Where you search determines what you find: the effects of bibliographic databases on systematic reviews. Int J Soc Res Methodol. (2022) 25:409–22. doi: 10.1080/13645579.2021.1892378

16. Saleh AA, Ratajeski MA, Bertolet M. Grey literature searching for health sciences systematic reviews: a prospective study of time spent and resources utilized. Evid Based Libr Inf Pract. (2014) 9:28–50. doi: 10.18438/B8DW3K

17. Richtig G, Berger M, Lange-Asschenfeldt B, Aberer W, Richtig E. Problems and challenges of predatory journals. J Eur Acad Dermatol Venereol. (2018) 32:1441–9. doi: 10.1111/jdv.15039

18. Májovský M, Cerný M, Kasal M, Komarc M, Netuka D. Artificial intelligence can generate fraudulent but authentic-looking scientific medical articles: Pandora's box has been opened. J Med Int Res. (2023) 25:e46924. doi: 10.2196/46924

19. Van Noorden R. More than 10,000 research papers were retracted in 2023—a new record. Nature. (2023) 624:479–81. doi: 10.1038/d41586-023-03974-8

20. Gerberi D, Taylor JM, Beeler CJ. Educating authors and users of the literature to increase vigilance of predatory publishing. J Hosp Librariansh. (2021) 21:207–16. doi: 10.1080/15323269.2021.1942691

21. Ross-White A, Godfrey CM, Sears KA, Wilson R. Predatory publications in evidence syntheses. J Med Libr Assoc. (2019) 107:57–61. doi: 10.5195/jmla.2019.491

22. Bramer WM, de Jonge GB, Rethlefsen ML, Mast F, Kleijnen J. A systematic approach to searching: an efficient and complete method to develop literature searches. J Med Libr Assoc. (2018) 106:531–41. doi: 10.5195/jmla.2018.283

23. Higgins JP, Thomas J, Chandler J, Cumpston M, Li T, Page MJ, et al. Cochrane Handbook for Systematic Reviews of Interventions. Chichester: John Wiley & Sons (2019).

24. Parker RMN, Neilson MJ. Lost in translation: supporting learners to search comprehensively across databases. J Can Health Librar. (2015) 36:54–8. doi: 10.5596/c15-014

25. Paul J, Barari M. Meta-analysis and traditional systematic literature reviews—What, why, when, where, and how? Psychol Market. (2022) 39:1099–115. doi: 10.1002/mar.21657

26. Card NA. Applied Meta-Analysis for Social Science Research. New York, NY: Guilford Publications (2015).

27. Rethlefsen ML, Farrell AM, Osterhaus Trzasko LC, Brigham TJ. Librarian co-authors correlated with higher quality reported search strategies in general internal medicine systematic reviews. J Clin Epidemiol. (2015) 68:617–26. doi: 10.1016/j.jclinepi.2014.11.025

28. Aamodt M, Huurdeman HC, Strømme H. Librarian co-authored systematic reviews are associated with lower risk of bias compared to systematic reviews with acknowledgement of librarians or no participation by librarians. Evid Based Libr Inf Pract. (2019) 14:103–27. doi: 10.18438/eblip29601

29. Ibragimova I, Fulbright H. Librarians and information specialists as methodological peer-reviewers: a case-study of the International Journal of Health Governance. Res Integr Peer Rev. (2024) 9:1. doi: 10.1186/s41073-023-00142-4

30. Smith LC. Reviews and reviewing: approaches to research synthesis. An Annual Review of Information Science and Technology (ARIST) paper. J Assoc Inf Sci Technol. (2024) 75:245–67. doi: 10.1002/asi.24851

31. Lindberg DA, Humphreys BL, McCray AT. The unified medical language system. Methods Inf Med. (1993) 32:281–91. doi: 10.1055/s-0038-1634945

32. Doig GS, Simpson F. Efficient literature searching: a core skill for the practice of evidence-based medicine. Intens Care Med. (2003) 29:2119–27. doi: 10.1007/s00134-003-1942-5

33. Bass MB, Molina RL, Reardon EE, Porcello L, Goldberg J, Nickum A, et al. Trends in…controlled vocabulary and health equity. Med Ref Serv Q. (2022) 41:185–201. doi: 10.1080/02763869.2022.2060638

34. McTavish JR, Rasmussen Neal D, Wathen CN. Is what you see what you get? Medical subject headings and their organizing work in the violence against women research literature article. Knowl Org. (2011) 38:381–97. doi: 10.5771/0943-7444-2011-5-381

35. Stevens EL. Recognizing the value of meta-research and making it easier to find. J Med Libr Assoc. (2023) 111:1758. doi: 10.5195/jmla.2023.1758

36. Logan J. Why do researchers co-author evidence syntheses with librarians? A mixed-methods study. Res Synth Methods. (2023) 14:489–503. doi: 10.1002/jrsm.1629

37. Tang F-SK-B, Bukowski M, Schmitz-Rode T, Farkas R. Guidance for clinical evaluation under the medical device regulation through automated scoping searches. Appl Sci. (2023) 13:7639. doi: 10.3390/app13137639

38. O'Connor AM, Tsafnat G, Thomas J, Glasziou P, Gilbert SB, Hutton B, et al. Question of trust: can we build an evidence base to gain trust in systematic review automation technologies? Syst Rev. (2019) 8:143. doi: 10.1186/s13643-019-1062-0

39. O'Connor AM, Glasziou P, Taylor M, Thomas J, Spijker R, Wolfe MS, et al. Focus on cross-purpose tools, automated recognition of study design in multiple disciplines, and evaluation of automation tools: a summary of significant discussions at the fourth meeting of the International Collaboration for Automation of Systematic Reviews (ICASR). Syst Rev. (2020) 9:100. doi: 10.1186/s13643-020-01351-4

40. Halamoda-Kenzaoui B, Rolland E, Piovesan J, Puertas Gallardo A, Bremer-Hoffmann S. Toxic effects of nanomaterials for health applications: how automation can support a systematic review of the literature? J Appl Toxicol. (2022) 42:41–51. doi: 10.1002/jat.4204

41. Laynor G. Can systematic reviews be automated? J Electron Resour Med Libr. (2022) 19:101–6. doi: 10.1080/15424065.2022.2113350

42. Wohlin C, Kalinowski M, Romero Felizardo K, Mendes E. Successful combination of database search and snowballing for identification of primary studies in systematic literature studies. Inf Softw Technol. (2022) 147:106908. doi: 10.1016/j.infsof.2022.106908

43. Bakker C, Theis-Mahon N, Brown SJ. Evaluating the accuracy of scite, a smart citation index. Hypothesis. (2023) 35:26528. doi: 10.18060/26528

44. Allot A, Lee K, Chen Q, Luo L, Lu Z. LitSuggest: a web-based system for literature recommendation and curation using machine learning. Nucleic Acids Res. (2021) 49:W352–8. doi: 10.1093/nar/gkab326

45. Goodwin Burri K. Intelligent use of artificial intelligence for systematic reviews of medical devices. Med Writing. (2019) 28:22–6.

46. Gramoun A. Digital tools for the clinical evaluation of medical devices: a guide to empower regulatory writers. Med Writing. (2023) 32:16–21. doi: 10.56012/hmts9332

47. Perlman-Arrow S, Loo N, Bobrovitz N, Yan T, Arora RK. A real-world evaluation of the implementation of NLP technology in abstract screening of a systematic review. Res Synth Methods. (2023) 14:608–21. doi: 10.1002/jrsm.1636

48. Yao X, Kumar MV, Su E, Flores Miranda A, Saha A, Sussman J. Evaluating the efficacy of artificial intelligence tools for the automation of systematic reviews in cancer research: a systematic review. Cancer Epidemiol. (2024) 88:102511. doi: 10.1016/j.canep.2023.102511

49. Marshall IJ, Trikalinos TA, Soboczenski F, Yun HS, Kell G, Marshall R, et al. In a pilot study, automated real-time systematic review updates were feasible, accurate, and work-saving. J Clin Epidemiol. (2023) 153:26–33. doi: 10.1016/j.jclinepi.2022.08.013

50. Wu L, Chen S, Guo L, Shpyleva S, Harris K, Fahmi T, et al. Development of benchmark datasets for text mining and sentiment analysis to accelerate regulatory literature review. Regul Toxicol Pharmacol. (2023) 137:105287. doi: 10.1016/j.yrtph.2022.105287

51. Hersh W. Search still matters: information retrieval in the era of generative AI. J Am Med Inform Assoc. (2024) ocae014. doi: 10.1093/jamia/ocae014

Keywords: meta-research, literature review methods, information retrieval, regulatory science, collaboration

Citation: Stevens ER and Laynor G (2024) Enhancing the quality and efficiency of regulatory science literature reviews through innovation and collaboration with library and information science experts. Front. Med. 11:1434427. doi: 10.3389/fmed.2024.1434427

Received: 17 May 2024; Accepted: 24 June 2024;
Published: 03 July 2024.

Edited by:

Christine Gispen-de Wied, Gispen4RegulatoryScience, Netherlands

Reviewed by:

Julie Monk, Monash University, Australia
Diane Schwartz, University at Buffalo, United States

Copyright © 2024 Stevens and Laynor. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Elizabeth R. Stevens, elizabeth.stevens@nyulangone.org
