PERSPECTIVE article

Front. Med., 06 January 2023
Sec. Rheumatology
This article is part of the Research Topic Reproducibility and Rigour in Rheumatology.

Reproducibility and rigor in rheumatology research

  • 1Division of Rheumatology, Department of Medicine, The University of Jordan, Amman, Jordan
  • 2Department of Business Administration, King Talal School of Business Technology, Princess Sumaya University for Technology, Amman, Jordan
  • 3School of Medicine, The University of Jordan, Amman, Jordan
  • 4Division of Pulmonary, Critical Care, Sleep, and Allergy, Department of Medicine, University of Illinois Chicago, Chicago, IL, United States
  • 5Department of Nephrology and Rheumatology, University Medical Center Göttingen, Göttingen, Germany
  • 6Division of Rheumatology, Department of Medicine, University of Illinois Chicago, Chicago, IL, United States

The pillars of scientific progress in rheumatology are experimentation and observation, followed by the publication of reliable and credible results. These data must then be independently verified, validated, and replicated. Peer and journal-specific technical and statistical reviews are paramount to improving rigor and reproducibility. In addition, research integrity, ethics, and responsible conduct training can help to reduce research misconduct and improve scientific evidence. As the number of published articles in rheumatology grows, evaluating the reproducibility of the field's findings has become critical. Prospective, longitudinal, randomized controlled clinical trials are the gold standard for evaluating the efficacy and safety of clinical interventions in this space. However, their applicability to larger, more representative patient populations with rheumatological disorders worldwide can be limited by the time, technical, and cost constraints involved in large-scale clinical trials. Accordingly, analyses of real-world, patient-centered clinical data retrieved from established healthcare inventories, such as electronic health records, medical billing reports, and disease registries, are increasingly used to report patient outcomes. Unfortunately, it remains unknown whether this clinical research paradigm in rheumatology could be deployed in medically underserved regions.

1. Reproducibility and rigor in rheumatology research

In a manuscript presented by invitation at the 1992 Society of Exploration Geophysics (SEG) meeting (1), Claerbout and Karrenbach from Stanford University coined the term “reproducible research.” Claerbout, who pioneered the use of computers to process and filter seismic exploration data, required that his students’ Ph.D. dissertations meet reproducibility standards, namely that the reported results could be independently regenerated by others using a single computer command.

However, the terms reproducibility and replicability are not clearly distinguished in the literature. Barba (2) proposed a distinction between them: “reproducible research” describes work whose results can be recreated by rerunning the analysis on the original data and computer code, whereas “replicability” describes a study that reaches the same scientific conclusions as another despite collecting new data and conducting new analyses.

Over half of the researchers polled by Nature indicated that the scientific community is facing a severe reproducibility crisis (3). Richard Smith, a former BMJ editor, stated that it might be time to stop assuming that research was indeed performed and accurately reported, and instead to assume it is fraudulent until evidence to the contrary is presented (4). The pressure to publish new scientific discoveries in top journals contributes significantly to this crisis, particularly for early-career researchers attempting to establish a solid scientific record (5).

The number of publications in all research fields is rapidly increasing; rheumatology is no exception. For example, Cheng and Zhang (6) reported a threefold increase in the number of rheumatology articles published over the preceding two decades, making this specialty attractive for reproducibility evaluation.

After admitting to 3 years of data manipulation in a phase II study of omalizumab for the treatment of anti-citrullinated peptide antibody (ACPA)-positive rheumatoid arthritis, a senior rheumatology investigator at a prestigious Dutch university hospital was dismissed (7). Senior academics could not replicate the research findings, and the publications reporting results from this trial were retracted.

Gasparyan et al. (8) searched PubMed, a publicly accessible electronic search engine for the life sciences and biomedical topics, to assess the scope and magnitude of duplicate and retracted publications in rheumatology. Thirty-seven rheumatology journals indexed in PubMed and ranked by the SCImago Journal Rank (SJR) indicator, a measure of the scientific influence of scholarly journals, were chosen. The total number of publications across all countries correlated significantly with the number of duplicate articles. Corrections published in 2013 accounted for 39% of all corrections, with 85% of these coming from a single journal (8). Eighty percent of retracted articles were published between January 1, 2000, and December 31, 2013, coinciding with the rise of open-access publishing. The top three categories of retractions were comparative studies, randomized trials, and reviews, with articles from the United States being the most frequently duplicated and retracted.
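To illustrate how such counts could be compiled today, the sketch below queries the publicly documented NCBI E-utilities interface to PubMed. It is only an illustration under assumed inputs (the journal names and the 2000–2013 window are examples), not the methodology used by Gasparyan et al.

```python
# Minimal sketch (illustrative only): count retractions and published errata
# for a few rheumatology journals via the NCBI E-utilities "esearch" endpoint.
# The journal list and the 2000-2013 window are assumptions for the example.
import requests

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"
JOURNALS = ["Ann Rheum Dis", "Arthritis Rheumatol", "Rheumatology (Oxford)"]

def count_hits(term: str) -> int:
    """Return the number of PubMed records matching a search term."""
    params = {"db": "pubmed", "term": term, "retmode": "json", "retmax": 0}
    reply = requests.get(EUTILS, params=params, timeout=30).json()
    return int(reply["esearchresult"]["count"])

for journal in JOURNALS:
    base = f'"{journal}"[Journal] AND 2000:2013[dp]'
    retracted = count_hits(f'{base} AND "Retracted Publication"[Publication Type]')
    errata = count_hits(f'{base} AND "Published Erratum"[Publication Type]')
    print(f"{journal}: {retracted} retractions, {errata} corrections (2000-2013)")
```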

2. Implementing reproducibility standards in rheumatology research

Randomized clinical trials (RCTs) are the gold standard for establishing clinical evidence in medicine. The strength and internal validity of the RCT stem from randomization’s ability to ensure that no differences exist between the two treatment arms other than the administration of the treatment under consideration (9).

The significance of RCTs is contingent upon transparent and precise reporting of results. The published article must accurately reflect the study protocol, and the statistical plan must be adhered to or formally modified, with substantial justification for any deviations (10). Negative findings, when encountered, must be reported rather than withheld.

The Consolidated Standards of Reporting Trials (CONSORT) statement (11), first published in 1996, is one effort to improve clinical trial reporting and transparency; it provides evidence-based guidelines for reporting randomized trials. Hill et al. (12) compared 121 trials published between 1997 and 1998 to 119 studies published between 1987 and 1988 (before the CONSORT statement) and found an improvement in the quality of trial reporting. However, even in high-impact journals, methodological issues persisted (12).

3. Avoiding and identifying fraudulent research in rheumatology

The US National Institutes of Health (NIH) defines responsible conduct of research (RCR) as the practice of conducting scientific investigations with integrity, applying established ethical and professional standards to all scientific research activities (13).

Responsible conduct and integrity training can reduce research misconduct and enhance scientific evidence (14). Moreover, early research experiences, both positive and negative, can influence researchers’ adherence to ethical standards. Two-thirds of the participants in a survey of researchers’ views on research integrity in Switzerland admitted that they had received no formal training in research integrity, and they cited personal ambition and moral compass as the most important determinants of ethical research (15). Therefore, beyond satisfying funding requirements, it is necessary to investigate actual research behavior and the factors influencing responsible research conduct.

Responsible conduct of research training is required for all US investigators funded by the NIH and the National Science Foundation (NSF). However, after almost two decades of such training, evidence of its efficacy in modifying research behavior remains inconsistent and limited (16).

4. Improving rigor of rheumatology research practices

Menke et al. (17) created an automated tool (SciScore) to rate how closely open-access scientific research articles adhere to rigor standards, such as those established by the NIH, and to the use of Research Resource Identifiers. In addition, they introduced the Rigor and Transparency Index (RTI), a yearly average score for journals based on the rigor and transparency of their articles. Since the RTI’s introduction, studies have met the rigor criteria more frequently, but only about half of these criteria, such as blinding or power analysis, are consistently reported by authors. Interestingly, the RTI did not correlate with the journal impact factor (17).
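As a rough, hypothetical illustration of how a journal-level yearly average such as the RTI can be assembled from per-article rigor scores, the sketch below scores each article by the fraction of rigor criteria it reports and averages the scores by journal and year. The criteria list and article records are invented and do not reproduce SciScore’s actual algorithm.

```python
# Hypothetical sketch: derive a journal-year "rigor index" as the average of
# per-article scores, where each score is the fraction of rigor criteria the
# article reports. Criteria and records are invented for illustration.
from collections import defaultdict

CRITERIA = ["blinding", "randomization", "power_analysis", "sex_as_variable", "rrid_cited"]

articles = [  # one record per screened article (hypothetical extraction output)
    {"journal": "Journal A", "year": 2020, "blinding": True, "randomization": True,
     "power_analysis": False, "sex_as_variable": True, "rrid_cited": False},
    {"journal": "Journal A", "year": 2020, "blinding": False, "randomization": True,
     "power_analysis": False, "sex_as_variable": False, "rrid_cited": True},
]

def article_score(record: dict) -> float:
    """Fraction of rigor criteria reported by one article (0-1)."""
    return sum(bool(record.get(c)) for c in CRITERIA) / len(CRITERIA)

by_journal_year = defaultdict(list)
for record in articles:
    by_journal_year[(record["journal"], record["year"])].append(article_score(record))

for (journal, year), scores in by_journal_year.items():
    print(journal, year, round(sum(scores) / len(scores), 2))  # yearly average
```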

Wide dissemination of published work and engagement of knowledgeable readers who can spot minor and major errors are the first steps in preventing misconduct. Research misconduct and the dissemination of false, inaccurate, or misleading information can occur in any publication. However, it has been noted that esteemed journals and periodicals using the open-access publishing model regularly retract articles or publish corrections (8). Additionally, the time between publication and retraction is shorter in journals with higher impact factors than in journals with lower impact, presumably because readers and authors pay less attention to the latter (18).

Maintaining ethical standards and integrity in research practice is also challenged by the detrimental effects of “paper mills.” These unethical organizations are adept at creating phony manuscripts, built on plagiarism, fabricated results, and falsified images, which are then submitted to scholarly journals. According to one study, retractions of such fraudulent articles are rising (19).

The NIH Rigor and Reproducibility policy, developed in 2016 in collaboration with the Nature Publishing Group and Science, exemplifies scholarly efforts to raise research integrity standards (20). These principles rest on a core set of standards for transparency in reporting, detailed methods, and rigorous statistical analysis, with an emphasis on data and material sharing: all datasets on which a manuscript’s conclusions are based must be made available upon request, potentially through deposition in publicly available repositories. Some rheumatology journals, such as Arthritis & Rheumatology, require that authors make the data and methods associated with a manuscript available to readers promptly and without undue restrictions. Furthermore, the International Committee of Medical Journal Editors (ICMJE) formulated guidelines to help authors and editors produce understandable and reproducible medical journal articles (21).

5. Is it time to re-reproduce landmark studies?

Randomized controlled clinical trials are the gold standard for determining the efficacy of an intervention, but their applicability to real-world clinical settings can be limited (22). These trials are tedious, challenging, and expensive. In addition, a trial’s inclusion criteria and patient selection can make it difficult to extrapolate the results to larger, more representative “real-world” patient populations.

Real-world data gathered from sources other than traditional clinical research settings, such as electronic health records (EHRs), billing information, and disease registries (22, 23), are increasingly used to supplement the outcomes of conventional clinical trials and provide an essential source of patient-reported outcome (PRO) measurements (24, 25). However, replicating the findings of clinical trials in real-world settings might not always be feasible. For example, a cross-sectional study of 220 clinical trials published in high-impact journals in 2017 found that only 15% could be replicated using insurance claims or EHRs (26).

Rheumatology research projects using real-world data have covered several rheumatological diagnoses, but the majority of the data focus on rheumatoid arthritis (RA) (24). The data collected by RA registries in various countries (27–32) allow a better understanding of patients’ disease outcomes, of responses to different therapies, particularly the emerging biologics and biosimilars, and of the safety of these agents (33). Real-world data play a similar role in other rheumatological diseases such as systemic lupus erythematosus, ankylosing spondylitis (AS), psoriatic arthritis (PsA), systemic sclerosis, idiopathic myositis, and vasculitis (24).

Real-world data have also been used to develop treatment guidelines, such as the European Society for Clinical and Economic Aspects of Osteoporosis, Osteoarthritis and Musculoskeletal Diseases (ESCEO) guidelines for knee osteoarthritis (34). These guidelines include agents such as chondroitin sulfate and glucosamine, which lack RCT evidence for their use but have shown some efficacy in improving pain and function in real-life studies (34).

The Rheumatology Informatics System for Effectiveness (RISE) (35) is an extensive data registry created by the American College of Rheumatology (ACR) that includes over 1,000 rheumatology clinicians and 2.5 million patients. With the introduction of the RISE Pilot Project Award (36) by the ACR’s RISE Registry and the Rheumatology Research Foundation in 2022, early-career researchers and clinicians interested in conducting rheumatology research using real-world EHR data now have access to the RISE registry.

Despite significant legal, logistical, and methodological challenges (25), the Nordic countries’ experience with collaboration across large population-based clinical rheumatology registries has enabled studies involving large numbers of patients with inflammatory arthritis from different countries, and several projects based on this registry network are currently under way (37). Examples of large-scale observational studies from this collaboration include a study of biologics used in 42,638 RA patients with a history of malignancy (38) and a study assessing the risk of neuroinflammatory events in 25,796 RA patients, 8,586 PsA patients, and 9,527 AS patients treated with TNF-alpha inhibitors (39).

Population-based observational studies provide insight into routine care delivered to a larger number of patients, including the elderly and those with comorbidities, and provide data on real-life long-term outcomes. Still, they have significant limitations, and certain difficulties are associated with using registry data (40). Such studies have limited internal validity and can be biased by treatment indications or practice changes (9).

Registries may not record longitudinal data, and their data may not be readily accessible; therefore, they may not keep pace with clinical practice. Because registries are frequently limited to the specific indication they are intended to record, registry data may not accurately reflect the typical clinical application of a given drug or medical device. For instance, registry patients’ medication use is typically recorded as taken or not taken, so no conclusions can be drawn about the long-term effects of a given dose. Similarly, the patient-reported outcome measures collected by registries may not capture all potential outcomes (40).

Investigators also have less incentive in terms of return on investment; sites therefore devote less time to a patient registry, and data entry is often driven by individual researchers’ motivation, particularly in academic institutions. Adequate compensation for data entry or an on-site researcher with relevant expertise may improve the rigor of data registration.

Missing data are common in patient registries and non-experimental studies that observe routine patient care (41). Imputation methods use model-predicted values to retain patients with missing data. Multiple imputation generates multiple datasets, each with a different imputed value for each missing variable, reflecting the uncertainty surrounding the missing values. In contrast, in single imputation the missing observation is replaced with the sample mean or median, a predicted value (e.g., from a regression model), or the value of a study patient who matches the missing data on a set of chosen covariables. Multiple imputation outperforms single imputation and is unlikely to introduce estimation bias because it uses a robust model of the missing data with good covariates (42).

Researchers must report the potential biases introduced by single imputation as well as its technical limitations; the sketch below illustrates the contrast between the two approaches.
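The following is a minimal sketch on toy, registry-style data using scikit-learn (an assumed tool, not one named by the cited sources); a chained-equations imputer rerun under different random seeds stands in for a full multiple-imputation workflow, and the column names and values are hypothetical.

```python
# Sketch: single (mean) imputation vs. a multiple-imputation-style approach
# on a toy registry extract. Columns: age, disease activity score, CRP;
# NaN marks missing values. Data are hypothetical.
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import SimpleImputer, IterativeImputer

X = np.array([
    [54.0, 3.2, 12.0],
    [61.0, np.nan, 8.0],
    [47.0, 5.1, np.nan],
    [np.nan, 4.4, 15.0],
    [58.0, 2.9, 6.0],
])

# Single imputation: every missing value is replaced by the column mean,
# which understates the uncertainty about the missing observations.
single = SimpleImputer(strategy="mean").fit_transform(X)

# Multiple-imputation style: a chained-equations imputer is rerun with
# different random seeds; each run yields one plausible completed dataset,
# and analyses would be run on each and the estimates pooled.
completed_datasets = [
    IterativeImputer(random_state=seed, sample_posterior=True).fit_transform(X)
    for seed in range(5)
]

print("Mean-imputed data:\n", np.round(single, 2))
print("Spread of imputed values across runs:\n",
      np.round(np.std(completed_datasets, axis=0), 2))
```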

The European Medicines Agency (EMA) established the Patient Registry Initiative in 2015 to improve the utilization of patient registries and support the definition of study populations and study protocols with guidance on data gathering, data quality management, and data analysis (43).

6. Economic impact of irreproducibility in rheumatology research

The economic impact of irreproducibility in biomedical research is enormous. Previous studies have shown that the prevalence of irreproducible preclinical research in the US alone exceeds 50%, with ∼28 billion US dollars spent yearly on preclinical research that is not reproducible (44). These low reproducibility rates erode the accumulation of new knowledge and contribute to appreciable delays and costs in drug development for patients in need. Thus, the lack of reproducibility in rheumatology research is too costly to ignore. Addressing this “reproducibility crisis” requires stakeholders to conduct unbiased, multi-faceted, root-cause analyses of irreproducible studies in this space, including, but not limited to, review of study design, data collection, analysis, and interpretation, and to propose corrective measures to prevent recurrences.
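As a back-of-the-envelope check on how the ∼28 billion US dollar figure arises, the short sketch below multiplies an assumed annual US preclinical research spend (implied by the numbers quoted above rather than stated here) by the cited irreproducibility rate.

```python
# Back-of-the-envelope check of the figure cited above (reference 44):
# yearly irreproducible spend = total US preclinical spend x irreproducibility rate.
# The ~$56B denominator is an assumption implied by the quoted numbers.
us_preclinical_spend_billion = 56.4   # assumed annual US preclinical spend, $B
irreproducibility_rate = 0.50         # ">50%" prevalence cited in the text

irreproducible_spend = us_preclinical_spend_billion * irreproducibility_rate
print(f"~${irreproducible_spend:.0f}B per year")  # approximately $28B, matching the citation
```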

7. Perspective and concluding remarks

The medical community is still suffering from the COVID-19 pandemic crisis, and the rate of physician burnout has increased significantly. Many talented young investigators have changed their career pathways and left academic medicine, and there is a national and international shortage of physician-scientists in rheumatology and other specialties. The additional impact of the lack of rigor and reproducibility in rheumatology poses a threat to the future of academic rheumatology, medical education, and valid research, which will ultimately affect patients’ health and well-being. Providing outstanding, evidence-based patient care grounded in sound science remains our utmost goal in the field of rheumatology. A national effort supported by multiple stakeholders in academic medicine is urgently required. Standardizing the definitions of integrity, ethics, and rigor is a step in the right direction.

Implementing rigorous peer and journal-specific technical and statistical review of submitted manuscripts against reproducibility criteria, such as the NIH Rigor and Reproducibility policy, can improve rigor and reproducibility by removing major causes of irreproducibility, such as methodological flaws and inadequate or inaccurate reporting, and thereby counteract its harmful effects.

Research integrity, ethics, and responsible conduct training can reduce unethical research and improve scientific evidence. By adhering to rigor guidelines and including a clear description of methods in submitted work, the scientific community can more easily replicate or invalidate research findings. Real-world data make it possible to replicate landmark studies and to add patient outcomes from underrepresented populations. Whether this clinical research paradigm in rheumatology and other disciplines of medicine could be deployed in medically underserved regions of the world remains to be determined.

Data availability statement

The original contributions presented in this study are included in the article/supplementary material; further inquiries can be directed to the corresponding author.

Author contributions

FA, NS (2nd author), JS, and NS (7th author) searched the literature and drafted and revised the manuscript. CA, PK, and IR critically revised and edited the manuscript. All authors contributed to the article and approved the submitted version.

Funding

This work was supported by a gracious donation from Professor Robert Barish, Vice Chancellor for Health Affairs at the University of Illinois Chicago.

Conflict of interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Publisher’s note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

References

1. Claerbout J, Karrenbach M. Electronic documents give reproducible research a new meaning. In: SEG Technical Program Expanded Abstracts 1992. Houston, TX: Society of Exploration Geophysicists (1992). p. 601–4. doi: 10.1190/1.1822162

2. Barba L. Terminologies for reproducible research. arXiv [Preprint]. (2018).

3. Baker M. 1,500 scientists lift the lid on reproducibility. Nature. (2016) 533:452–4. doi: 10.1038/533452a

4. Smith R. Time to Assume That Health Research is Fraudulent Until Proven Otherwise? BMJ Opinion. (2021). Available at: https://blogs.bmj.com/bmj/2021/07/05/time-to-assume-that-health-research-is-fraudulent-until-proved-otherwise/ (accessed December 22, 2022).

5. National Academies of Sciences, Engineering, and Medicine. Reproducibility and Replicability in Science. Washington, DC: National Academies Press (2019).

6. Cheng T, Zhang G. Worldwide research productivity in the field of rheumatology from 1996 to 2010: a bibliometric analysis. Rheumatology. (2013) 52:1630–4. doi: 10.1093/rheumatology/ket008

7. Sheldon T. Senior Dutch researcher sacked for manipulating data in rheumatoid arthritis drug trial. BMJ. (2013) 347:f5267. doi: 10.1136/bmj.f5267

8. Gasparyan A, Ayvazyan L, Akazhanov N, Kitas G. Self-correction in biomedical publications and the scientific impact. Croat Med J. (2014) 55:61. doi: 10.3325/cmj.2014.55.61

9. Booth C, Tannock I. Randomised controlled trials and population-based observational research: partners in the evolution of medical evidence. Br J Cancer. (2014) 110:551–5. doi: 10.1038/bjc.2013.725

10. Bauchner H, Golub R, Fontanarosa P. Reporting and interpretation of randomized clinical trials. JAMA. (2019) 322:732–5. doi: 10.1001/jama.2019.12056

11. CONSORT. Welcome to the CONSORT Website. (2022). Available online at: https://www.consort-statement.org/ (accessed October 14, 2022).

12. Hill C, LaValley M, Felson D. Secular changes in the quality of published randomized clinical trials in rheumatology. Arthritis Rheum. (2002) 46:779–84. doi: 10.1002/art.512

13. NIH. NOT-OD-22-055: FY 2022 Updated Guidance: Requirement for Instruction in the Responsible Conduct of Research. (2022). Available online at: https://grants.nih.gov/grants/guide/notice-files/NOT-OD-22-055.html (accessed October 14, 2022).

14. Steneck N, Bulger R. The history, purpose, and future of instruction in the responsible conduct of research. Acad Med. (2007) 82:829–34. doi: 10.1097/ACM.0b013e31812f7d4d

15. Satalkar P, Shaw D. How do researchers acquire and develop notions of research integrity? A qualitative study among biomedical researchers in Switzerland. BMC Med Ethics. (2019) 20:72. doi: 10.1186/s12910-019-0410-x

16. Phillips T, Nestor F, Beach G, Heitman E. America COMPETES at 5 years: an analysis of research-intensive universities’ RCR training plans. Sci Eng Ethics. (2018) 24:227–49. doi: 10.1007/s11948-017-9883-5

17. Menke J, Roelandse M, Ozyurt B, Martone M, Bandrowski A. The rigor and transparency index quality metric for assessing biological and medical science methods. iScience. (2020) 23:101698. doi: 10.1016/j.isci.2020.101698

18. Steen R, Casadevall A, Fang F. Why has the number of scientific retractions increased? In: Kazdin AE, editor. Methodological Issues and Strategies in Clinical Research. Washington, DC: American Psychological Association (2016). p. 557–68. doi: 10.1037/14805-034

19. Pérez-Neri I, Pineda C, Sandoval H. Threats to scholarly research integrity arising from paper mills: a rapid scoping review. Clin Rheumatol. (2022) 41:2241–8. doi: 10.1007/s10067-022-06198-9

20. NIH. Principles and Guidelines for Reporting Preclinical Research. (2022). Available online at: https://www.nih.gov/research-training/rigor-reproducibility/principles-guidelines-reporting-preclinical-research (accessed October 14, 2022).

21. ICMJE. Recommendations. (2022). Available online at: https://www.icmje.org/recommendations/ (accessed October 14, 2022).

22. Sherman R, Anderson S, Dal Pan G, Gray G, Gross T, Hunter N, et al. Real-world evidence—what is it and what can it tell us? N Engl J Med. (2016) 375:2293–7. doi: 10.1056/NEJMsb1609216

23. FDA. Use of Real-World Evidence to Support Regulatory Decision-Making for Medical Devices 1 Guidance for Industry and Food and Drug Administration Staff. (2022). Available online at: https://www.fda.gov/media/99447/download (accessed October 14, 2022).

24. Misra D, Agarwal V. Real-world evidence in rheumatic diseases: relevance and lessons learnt. Rheumatol Int. (2019) 39:403–16. doi: 10.1007/s00296-019-04248-1

25. Chatzidionysiou K, Hetland M, Frisell T, Di Giuseppe D, Hellgren K, Glintborg B, et al. Opportunities and challenges for real-world studies on chronic inflammatory joint diseases through data enrichment and collaboration between national registers: the Nordic example. RMD Open. (2018) 4:e000655. doi: 10.1136/rmdopen-2018-000655

26. Bartlett V, Dhruva S, Shah N, Ryan P, Ross J. Feasibility of using real-world data to replicate clinical trial evidence. JAMA Netw Open. (2019) 2:e1912869. doi: 10.1001/jamanetworkopen.2019.12869

27. Favalli E, Becciolini A, Biggioggero M, Marchesoni A, Meroni P. Is there a need for new thresholds to define remission and low disease activity by disease activity score 28 calculated with C reactive protein? Real life data from a local registry. Ann Rheum Dis. (2015) 74:e5. doi: 10.1136/annrheumdis-2014-206651

28. Kaltsonoudis E, Pelechas E, Voulgari PV, Drosos A. Unmet needs in the treatment of rheumatoid arthritis. An observational study and a real-life experience from a single university center. Semin Arthritis Rheum. (2019) 48:597–602. doi: 10.1016/j.semarthrit.2018.06.003

29. Norvang V, Sexton J, Kristianslund E, Olsen I, Uhlig T, Bakland G, et al. Predicting achievement of the treatment targets at 6 months from 3-month response levels in rheumatoid arthritis: data from real-life follow-up in the NOR-DMARD study. RMD Open. (2018) 4:e000773. doi: 10.1136/rmdopen-2018-000773

30. Gremese E, Salaffi F, Bosello S, Ciapetti A, Bobbio-Pallavicini F, Caporali R, et al. Very early rheumatoid arthritis as a predictor of remission: a multicentre real life prospective study. Ann Rheum Dis. (2013) 72:858–62. doi: 10.1136/annrheumdis-2012-201456

31. Vieira-Sousa E, Eusébio M, Ávila-Ribeiro P, Khmelinskii N, Cruz-Machado R, Rocha T, et al. Real-world longterm effectiveness of tumor necrosis factor inhibitors in psoriatic arthritis patients from the rheumatic diseases Portuguese register. J Rheumatol. (2020) 47:690–700. doi: 10.3899/jrheum.181272

32. Bajema I, Bruijn J, Casian A, Cid M, Csernok E, Van Daalen E, et al. The European vasculitis society 2016 meeting report. Kidney Int Rep. (2017) 2:1018–31. doi: 10.1016/j.ekir.2017.09.008

33. van der Leeuw M, Messelink M, Tekstra J, Medina O, van Laar J, Haitjema S, et al. Using real-world data to dynamically predict flares during tapering of biological DMARDs in rheumatoid arthritis: development, validation, and potential impact of prediction-aided decisions. Arthritis Res Ther. (2022) 24:1–11. doi: 10.1186/s13075-022-02751-8

34. Bruyère O, Honvo G, Veronese N, Arden N, Branco J, Curtis E, et al. An updated algorithm recommendation for the management of knee osteoarthritis from the European Society for Clinical and Economic Aspects of Osteoporosis, Osteoarthritis and Musculoskeletal Diseases (ESCEO). Semin Arthritis Rheum. (2019) 49:337–50. doi: 10.1016/j.semarthrit.2019.04.008

35. American College of Rheumatology. RISE Registry. (2022). Available online at: https://www.rheumatology.org/Practice-Quality/RISE-Registry (accessed October 14, 2022).

36. The American College of Rheumatology [ACR]. RISE Pilot Project Award, Rheumatology Research Foundation. (2022). Available online at: https://www.rheumresearch.org/file/awards/education-and-training/RISE-Pilot-Project-Award---Final-RFA.pdf (accessed 2022).

37. FOREUM. Pan-Nordic RA Register Network. (2022). Available online at: https://foreum.org/projects.cfm?projectid=116 (accessed October 14, 2022).

38. Chatzidionysiou K, Delcoigne B, Frisell T, Hetland M, Glintborg B, Cordtz R, et al. How do we use biologics in rheumatoid arthritis patients with a history of malignancy? An assessment of treatment patterns using Scandinavian registers. RMD Open. (2020) 6:e001363. doi: 10.1136/rmdopen-2020-001363

39. Kopp T, Delcoigne B, Arkema EV, Jacobsen R, Magyari M, Ibfelt E, et al. Risk of neuroinflammatory events in arthritis patients treated with tumour necrosis factor alpha inhibitors: a collaborative population-based cohort study from Denmark and Sweden. Ann Rheum Dis. (2020) 79:566–72. doi: 10.1136/annrheumdis-2019-216693

40. Li G, Sajobi T, Menon B, Korngut L, Lowerison M, James M, et al. Registry-based randomized controlled trials- what are the advantages, challenges, and areas for future research? J Clin Epidemiol. (2016) 80:16–24. doi: 10.1016/j.jclinepi.2016.08.003

41. Mack C, Su Z, Westreich D. Managing Missing Data in Patient Registries: Addendum to Registries for Evaluating Patient Outcomes: A User’s Guide. 3rd ed. Rockville, MD: Agency for Healthcare Research and Quality (2018). doi: 10.23970/AHRQREGISTRIESMISSINGDATA

42. Chang M. Modern issues and methods in biostatistics. In: Gail M, Krickeberg K, Samet J, Tsiatis A, Wong W, editors. Statistics for Biology and Health. New York, NY: Springer (2011). doi: 10.1007/978-1-4419-9842-2

43. EMA. Patient Registries. (2022). Available online at: https://www.ema.europa.eu/en/human-regulatory/post-authorisation/patient-registries (accessed December 14, 2022).

44. Freedman L, Cockburn I, Simcoe T. The economics of reproducibility in preclinical research. PLoS Biol. (2015) 13:e1002165. doi: 10.1371/journal.pbio.1002165

Keywords: clinical trials, real-world data, registry, ethics, responsible conduct, research integrity

Citation: Alnaimat F, Sweis NJ, Sweis JJG, Ascoli C, Korsten P, Rubinstein I and Sweiss NJ (2023) Reproducibility and rigor in rheumatology research. Front. Med. 9:1073551. doi: 10.3389/fmed.2022.1073551

Received: 18 October 2022; Accepted: 15 December 2022;
Published: 06 January 2023.

Edited by:

João Eurico Fonseca, University of Lisbon, Portugal

Reviewed by:

Ioannis Parodis, Karolinska Institutet (KI), Sweden

Copyright © 2023 Alnaimat, Sweis, Sweis, Ascoli, Korsten, Rubinstein and Sweiss. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Nadera J. Sweiss, nsweiss@uic.edu
