EDITORIAL article

Front. Med., 11 July 2022
Sec. Regulatory Science
This article is part of the Research Topic The Process Evaluation of Clinical Trials

Editorial: The Process Evaluation of Clinical Trials

  • 1Center Health Systems Science, The George Institute for Global Health, The University of New South Wales, Newtown, NSW, Australia
  • 2School of Public Health, University of Sydney, Sydney, NSW, Australia
  • 3Department of Behavioral Sciences and Learning, Department of Biomedical and Clinical Sciences, Linköping University, Linköping, Sweden
  • 4Division of Psychiatry, Department of Clinical Neuroscience, Karolinska Institute, Stockholm, Sweden
  • 5Department of Otolaryngology–Head and Neck Surgery, University of Colorado School of Medicine, Aurora, CO, United States
  • 6UCHealth Hearing and Balance, University of Colorado Hospital, Aurora, CO, United States
  • 7Virtual Hearing Lab, Collaborative Initiative Between University of Colorado School of Medicine and University of Pretoria, Aurora, CO, United States
  • 8Department of Speech-Language Pathology and Audiology, University of Pretoria, Pretoria, South Africa
  • 9Department of Speech and Hearing, Manipal College of Health Professions, Manipal Academy of Higher Education, Manipal, India

Editorial on the Research Topic
The Process Evaluation of Clinical Trials

Background: History of Process Evaluations

The history of clinical trials stretches back to 500 BC, though some credit the French surgeon Ambroise Paré with the first documented clinical trial, involving the treatment of wounds during the sixteenth century (1). Since then, clinical trials have evolved tremendously and have become the foundation of modern medical and healthcare practice, focusing on clinical outcomes. However, over the past decades there has been increasing interest in performing “process evaluations” of clinical trials of complex interventions (2). While an outcome evaluation focuses on whether a new intervention works, a process evaluation supplements this knowledge by providing an understanding of the causal mechanisms of the intervention and of the contextual and implementation factors impacting the outcomes (3).

Process evaluation methodology has evolved through the years (2). Early process evaluations assessed implementation through the analysis of quantitative process indicators. Subsequently, there was increasing recognition of the need for qualitative research alongside trials to provide a deeper understanding of the disease condition, the acceptability of an intervention, and implementation issues (4). Process evaluations were deemed particularly relevant after a negative trial result, to determine whether there was implementation failure, intervention failure, or both. There is also growing recognition that using qualitative and quantitative data, together with theoretical frameworks, within process evaluations helps translate evidence into practice (5–7). Process evaluations can help address stakeholders' question of “Is this intervention acceptable, effective, affordable and feasible (for me or) for this population?” (7).

Key domains are summarized in the UK Medical Research Council (MRC) process evaluation guidance (context, quality of implementation, and mechanisms of the intervention), which also includes concepts from established and widely used evaluation frameworks, including the Reach, Effectiveness, Adoption, Implementation and Maintenance (RE-AIM) framework (8) and Linnan and Steckler (9). Although each framework is unique, there is some overlap in their emphasis on enabling research translation. The key concepts include: (i) reach and recruitment (i.e., the extent to which the intervention was received by the targeted group), (ii) adoption (i.e., related to the delivery of the intervention), (iii) acceptability (i.e., the extent to which participants find the intervention acceptable), (iv) implementation fidelity (i.e., the extent to which the intervention is delivered as planned), and (v) maintenance (i.e., the extent to which the intervention can be sustained over time after the clinical trial is over).

Special Issue: Process Evaluations of Clinical Trials

This special issue builds on the emerging value and methodology of process evaluations. It includes nine manuscripts covering a range of interventions, thereby highlighting the transferability and value of process evaluations across intervention types, and in unpacking context from lower-middle-income countries to high-income countries with established health systems. Chu et al. presented a mixed-methods process evaluation of community-based dietary sodium reduction in rural China. In another study, patients' and physicians' expectations regarding the pragmatic trial design of integrative medicine for diabetes and kidney diseases were evaluated and reported (Chan et al.). Four studies focused on process evaluations of telehealth interventions. Meijerink et al. presented a process evaluation of an online support program for hearing aid users. Beukes et al. and Biliunaite et al. provided process evaluation results of internet-based cognitive behavioral therapy for tinnitus and for informal caregivers, respectively. Indraratna et al. presented the process evaluation of TeleClinical care for acute coronary syndrome and heart failure. Two studies also included an implementation science approach. Riddell et al. evaluated the implementation and scalability of Accredited Social Health Activist (ASHA)-led community-based support groups for hypertension in rural India. In another study, Ouyang et al. provided the process evaluation of an implementation trial on intracerebral hemorrhage. Finally, Wu et al. presented a comprehensive process evaluation of pediatric drug clinical trials through a literature review.

The process evaluations in this collection were also conducted across different phases of the research cycle: study design (Chu et al.), the pilot/feasibility phase (Biliunaite et al.; Indraratna et al.), evaluation of the clinical trial (Chu et al.; Meijerink et al.; Beukes et al.; Riddell et al.; Ouyang et al.), and long-term sustainability (Riddell et al.; Wu et al.). This highlights the value of process evaluation findings in informing intervention design and optimizing implementation. Moreover, while the use of theoretical frameworks is helpful in eliciting contextual determinants across individual, organizational and system, and policy levels, the breadth and scope of frameworks in the literature can be daunting (3). Careful consideration of which theories are relevant is therefore helpful (10). For instance, in this special issue, for interventions related to individual behavioral change, the health belief model used by Chu et al., or others such as the behavior change wheel or cognitive theories, may be helpful. Normalization process theory, which has a strong focus on understanding organizational behavior, was used by Ouyang et al. in their implementation study in stroke units for intracerebral hemorrhage.

Where to From Here?

Indeed, as we reflect on the emerging value and methodology of process evaluations, it is worth noting their contributions to implementation science, as researchers endeavor to meet end-users' needs, understand what happened on the ground, and learn how to overcome implementation barriers. As we continue to invest in clinical trials to inform evidence-based medicine and policy, we recommend embedding process evaluations throughout the research cycle to examine for whom, how, and why a clinical trial had its outcomes. This will require building capacity in mixed methods, implementation science, and stakeholder engagement/co-design of implementation strategies, which in turn will require sufficient resourcing, budgeting, and time, and, most importantly, training those who conduct clinical trials in process evaluation and implementation science. In doing so, regardless of whether a trial result is positive or negative, we will learn to improve our research and intervention design to fit the local context and to enable the long-term sustainability and scale-up of effective interventions.

Author Contributions

HL drafted the initial manuscript, with significant input from VM. All authors reviewed versions of the manuscript and approved its submission.

Funding

HL is supported by a Program Grant Fellowship from The George Institute for Global Health.

Conflict of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Publisher's Note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

Acknowledgments

We thank all the authors of the manuscripts in this special Research Topic for their interest in and support of this collection. Special thanks to the editors and reviewers of these manuscripts, which cover a diverse content area.

References

1. Bhatt A. Evolution of clinical research: a history before and beyond James Lind. Perspect Clin Res. (2010) 1:6–10.

2. Moore GF, Audrey S, Barker M, Bond L, Bonell C, Hardeman W, et al. Process evaluation of complex interventions: Medical Research Council guidance. BMJ. (2015) 350:h1258. doi: 10.1136/bmj.h1258

3. Liu H, Mohammed A, Shanthosh J, News M, Laba TL, Hackett ML, et al. Process evaluations of primary care interventions addressing chronic disease: a systematic review. BMJ Open. (2019) 9:e025127. doi: 10.1136/bmjopen-2018-025127

4. Lewin S, Glenton C, Oxman AD. Use of qualitative methods alongside randomised controlled trials of complex healthcare interventions: methodological study. BMJ. (2009) 339:b3496. doi: 10.1136/bmj.b3496

5. Curry LA, Nembhard IM, Bradley EH. Qualitative and mixed methods provide unique contributions to outcomes research. Circulation. (2009) 119:1442–52. doi: 10.1161/CIRCULATIONAHA.107.742775

6. McIlvennan CK, Morris MA, Guetterman TC, Matlock DD, Curry L. Qualitative methodology in cardiovascular outcomes research: a contemporary look. Circ Cardiovasc Qual Outcomes. (2019) 12:e005828. doi: 10.1161/CIRCOUTCOMES.119.005828

7. Skivington K, Matthews L, Simpson SA, Craig P, Baird J, Blazeby JM, et al. A new framework for developing and evaluating complex interventions: update of Medical Research Council guidance. BMJ. (2021) 374:n2061. doi: 10.1136/bmj.n2061

8. Glasgow RE, Vogt TM, Boles SM. Evaluating the public health impact of health promotion interventions: the RE-AIM framework. Am J Public Health. (1999) 89:1322–7. doi: 10.2105/AJPH.89.9.1322

9. Linnan L, Steckler A. Process Evaluation for Public Health Interventions and Research. San Francisco, CA: Jossey-Bass (2002).

10. Nilsen P. Making sense of implementation theories, models and frameworks. Implement Sci. (2015) 10:53. doi: 10.1186/s13012-015-0242-0

Keywords: clinical trials, process evaluation, qualitative, mixed-methods, randomized controlled trial, implementation science, implementation research

Citation: Liu H, Andersson G and Manchaiah V (2022) Editorial: The Process Evaluation of Clinical Trials. Front. Med. 9:950637. doi: 10.3389/fmed.2022.950637

Received: 23 May 2022; Accepted: 06 June 2022;
Published: 11 July 2022.

Edited and reviewed by: Sandor Kerpel-Fronius, Semmelweis University, Hungary

Copyright © 2022 Liu, Andersson and Manchaiah. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Hueiming Liu, hliu@georgeinstitute.org.au