- 1Cooperative Institute for Research in Environmental Sciences, University of Colorado Boulder, Boulder, CO, United States
- 2National Snow and Ice Data Center, University of Colorado Boulder, Boulder, CO, United States
- 3Laboratoire des Sciences du Climat et de l’Environnement, UMR 8212 CEA-CNRS-UVSQ, Université Paris-Saclay, IPSL, Gif-sur-Yvette, France
- 4Institut des Sciences de la Mer (ISMER), Université du Québec à Rimouski (UQAR), Rimouski, QC, Canada
- 5National Meteorological Service (Argentina), Buenos Aires, Argentina
- 6Young Earth System Scientists (YESS), Buenos Aires City, Argentina
- 7Natural Sciences - Geology, Northwest Missouri State University, Maryville, MO, United States
- 8Earth, Environmental, and Planetary Sciences, Case Western Reserve University, Cleveland, OH, United States
- 9Department of Physics and Astronomy, University of Nigeria, Nsukka, Nigeria
- 10State Key Laboratory of Grasslands & Agroecosystem, Lanzhou University, Lanzhou, China
- 11PODA-Pakistan, Islamabad, Pakistan
- 12Tunisia Polytechnic School, University of Carthage, La Marsa, Tunisia
- 13Institut des Géosciences de l’Environnement, CNRS/UGA/Grenoble-INP/IRD/INRAE, Saint-Martin d’Hères, France
- 14Carrera de Biología Marina, Universidad Científica del Sur, Lima, Peru
- 15B.Geos, Korneuburg, Austria
- 16Laboratoire d'Océanographie et du Climat Expérimentations et Approches Numériques, LOCEAN/IPSL, UPMC-CNRS-IRD-MNHN, Sorbonne Université, Paris, France
- 17Institut des Géosciences de l'Environnement, CNRS / UGA / Grenoble-INP/IRD/INRAE, Saint-Martin d'Hères, France
- 18Department of Biology, McMaster University, Hamilton, ON, Canada
- 19Department of Science, Universidad Nacional Agraria La Molina, Lima, Perú
- 20Department of Geoscience, Aarhus University, Aarhus, Denmark
- 21National Centre for Polar and Ocean Research, Ministry of Earth Sciences, Goa, India
- 22Faculty of African Postgraduate Studies, Natural Resources, Cairo University, Giza, Egypt
- 23Marine and Environmental Sciences Centre, Department of Life Sciences, University of Coimbra, Coimbra, Portugal
- 24British Antarctic Survey, Natural Environment Research Council, Cambridge, United Kingdom
- 25Arctic Basecamp Foundation, Rotterdam, Netherlands
- 26Department of Geography, Geoinformatics and Meteorology, University of Pretoria, Pretoria, South Africa
- 27Stable Light Isotope Laboratory, Department of Archaeology, University of Cape Town, Cape Town, South Africa
- 28Node for Isotope Biogeochemistry, South African Biogeochemistry Research Infrastructure Platform, Cape Town, South Africa
- 29Escuela de Ciencias del Mar, Pontificia Universidad Catolica de Valparaiso, Valparaíso, Chile
- 30Arctic Biogeochemistry and Ecology, National Centre for Polar and Ocean Research, Ministry of Earth Sciences, Goa, India
- 31Institute for Risk and Disaster Reduction, University College London, London, United Kingdom
- 32Atmosphere and Climate (ATMOS), The Climate and Environmental Research Institute (NILU), Kjeller, Norway
- 33Remote Sensing Services, Svalbard Integrated Arctic Earth Observing System (SIOS), Longyearbyen, Norway
- 34Department of Civil Engineering, Manipal Academy of Higher Education (MAHE), Manipal, India
- 35Climate of High Northern Latitudes Research Group, Nansen International Environmental and Remote Sensing Centre, Saint Petersburg, Russia
- 36Institute of Earth Sciences, Saint Petersburg State University, Saint Petersburg, Russia
- 37Meteorology and Oceanography Centre (Pacific), Royal Canadian Navy, Esquimalt, BC, Canada
- 38Climate Lab, School of Earth and Ocean Sciences, University of Victoria, Victoria, BC, Canada
- 39University of Miami, Cooperative Institute for Marine and Atmospheric Studies (CIMAS), National Oceanic and Atmospheric Administration, AOML, Miami, FL, United States
- 40CIMAS, RSMAS, University of Miami, Miami, FL, United States
- 41Department of Computer Science, Harvey Mudd College, Claremont, CA, United States
- 42Geography, Olabisi Onabanjo University, Ago-Iwoye, Nigeria
- 43Observational Seismology Group, Institute of Seismological Research, Gandhinagar, India
- 44Seismology and Geosciences Division, Ministry of Earth Sciences, New Delhi, India
- 45Radiocarbon Laboratory (LAC-UFF), Department of Geochemistry, Universidade Federal Fluminense, Niterói, Brazil
- 46Princess Margaret Cancer Centre, Drug Development Program - Phase 1, University Health Network, Toronto, ON, Canada
- 47Department of Business Studies, Uppsala University, Uppsala, Sweden
- 48Department of Management and Organization, Stockholm School of Economics, Stockholm, Sweden
- 49Departamento de Ingeniería Hidráulica y Ambiental, Pontificia Universidad Católica de Chile, Santiago, Chile
- 50PermaChile Network, Santiago, Chile
- 51Applied Functional Ecology, Department of Plant Biology and Ecology, Universidad de Sevilla, Sevilla, Spain
- 52Bacteria Physiology and Genetics, Department of Life Sciences, Université de Liège, Liège, Belgium
- 53Protistology and Aquatic Ecology, Department of Biology, Universiteit Gent, Ghent, Belgium
- 54Catchment to Coast, Soil and Land Use, UK Centre for Ecology and Hydrology, Bangor, United Kingdom
- 55Permafrost Section, Alfred Wegener Institute Helmholtz Centre for Polar and Marine Research, Potsdam, Germany
- 56Institute of Geosciences, University of Potsdam, Potsdam, Germany
The participation of a group of Early Career Researchers (ECRs) that is diverse in terms of geography, discipline, and gender in the peer review process can help alleviate the workload of senior researchers and counteract the perceptual biases that the latter tend to show. Moreover, ECRs can benefit from developing skills that are often not included in educational programs. From 2018 to 2021, the Association of Polar Early Career Scientists, in collaboration with other associations, organized six group reviews of the Intergovernmental Panel on Climate Change (IPCC) reports by a total of more than 600 ECRs from over 70 different countries. This study aims to evaluate this group review process in terms of its contribution to the production of scientific knowledge, and as a career development opportunity for ECRs. The data analyzed consist of application forms, review comments, and feedback surveys that were collected during each review process. The results of this study show that, overall, the group reviews were a success in terms of the experience of ECRs and their contribution to the peer review of the IPCC reports. Most survey respondents considered the general organization of the group reviews satisfactory and expressed interest in participating in future group reviews. However, most participants did not engage in discussions with their peers, which constitutes a missed opportunity to engage in active learning and the shared production of knowledge. ECRs made a significant contribution to the review of the IPCC reports by producing an average of 2,422 ± 532 comments per group review, 36% of which were substantive. PhD students were shown to be as proficient reviewers as postdoctoral researchers and faculty reviewers. More importantly, the diversity of reviewers in terms of geography and discipline, together with the fact that they are ECRs, can help produce more balanced scientific reports since they bring new perspectives, thus counteracting the biases that senior researchers have. These group reviews could be improved by providing more comprehensive training and facilitating communication among reviewers so that they can engage in meaningful exchanges. We conclude that the IPCC should formalize the inclusion of ECRs in future reviews of the IPCC reports.
1 Introduction
The Intergovernmental Panel on Climate Change (IPCC) was created in 1988 by the World Meteorological Organization and the United Nations Environment Program to assess the most advanced state of knowledge related to climate change. This is achieved through a cycle of Assessment Reports (ARs), each comprising several volumes, alongside special reports. For the 6th cycle, the Assessment Report (AR6) was composed of three volumes and a synthesis report, and three special reports were produced. A crucial step of the IPCC AR cycle is its review process, which is transparent and conducted by experts and governments from around the world to ensure an objective and comprehensive assessment of climate change that reflects a wide range of views and expertise. The peer review exercise plays an important role in scientific research as an integral part of the publication process. It contributes to the progress of science (Gropp et al., 2017) by providing useful critiques, constructive feedback, and insights that help authors and reviewers refine their thinking, sharpen their arguments and develop effective communication skills, maintaining the quality standards of scientific publications (Cantor and Gero, 2015; Jana, 2019; Matsui et al., 2021). Both representatives of governments and the entire scientific community, including Early Career Researchers (ECRs), are invited to participate in the review process of the IPCC ARs.
ECRs stand to benefit from taking part in the peer review process. For instance, a recent Nature publication discussed that, by reviewing scientific literature, ECRs grow better versed in the current state of knowledge in their field of expertise, develop skills that are often overlooked in postgraduate educational programs, and improve their online visibility by having their names attached to a review (Dance, 2022). Many education researchers suggest that engaging ECRs in scientific writing and reviewing as early as possible (whether through undergraduate thesis writing, graduate courses centered on learning how to peer review, or active involvement in reviewing journal publications) creates ECRs capable of publishing higher-quality literature than their peers (Reynolds and Thompson, 2017). In the context of reviews conducted to uphold academic integrity, Rodríguez-Bravo et al. (2017) suggest that ECRs learn a lot about their own research methods when scrutinizing the work of other academics and develop tools useful for their own careers in academic publishing (e.g., a more comprehensive understanding of the review process and exposure to academic writing and presentation styles).
Reviewing is a critical skill that many academics learn through experience and interactions rather than through a structured mechanism. While journals may shy away from including ECRs in the review process due to a perceived lack of experience, we jeopardize the continuity of this key process by not providing ECRs with the opportunity to participate (Walker, 2018). As peer review continues to be central to the production of high-quality publications, providing opportunities for ECRs to engage in review processes ensures that new reviewers gain competent skills in a supportive and encouraging environment (Casado, 2018), helping to guarantee future academic integrity (McNair et al., 2019). It is encouraging that some publishers and organizations are starting to recognize the need to include ECRs in the peer review process. For example, the European Geosciences Union1 offers hands-on peer review training to ECRs and adds them to the reviewer database of its journals. In addition, including ECRs in peer review activity may reduce the prestige-based bias (Lee et al., 2013) that can arise when reviews rely mainly on over-solicited senior scientists. Finally, the inclusion of ECRs in the review process of IPCC reports helps produce more inclusive assessment reports, especially when the ECRs come from different socio-cultural and economic backgrounds, since ECRs compensate for their less exhaustive knowledge with fewer perceptual biases about what the field should be. While ECRs perform the majority of scientific research and are the most gender diverse group within the wider research community, they are also the most vulnerable (Castelló et al., 2015; Heggeness et al., 2017). The COVID-19 pandemic had a particularly negative impact on ECRs; ECRs reported a nearly three-fold increase in burnout, significant reductions in productivity, and worsening mental health as a consequence of the pandemic, which has the potential to disrupt the trajectories of their careers (Harrop et al., 2021). Given the influence that scientific rigor exerts on the timeline and resources required for research output, COVID-19 had diverse impacts on scientists globally (Myers et al., 2020; Gao et al., 2021; Jawak et al., 2021). For example, Myers et al. (2020) highlight that existing gender gaps in science place an undue burden on female scientists, particularly those with young children, who experienced a precipitous decline in the time spent on research during the pandemic. The decrease in new research projects can be traced to multiple influences, including reduced access to facilities and field sites, limited in-person training and mentor access, reduced funding for non-COVID-19-related research, increased teaching demands, and psychological stress (Gao et al., 2021). All these factors have an even greater impact on the productivity and work of ECRs.
While the COVID-19 pandemic significantly affected productivity and opportunities for ECRs (López-Vergès et al., 2021), there were new online opportunities that allowed ECRs to remain active in the scientific community. One such activity was the opportunity to participate in the review of the IPCC reports. The Association of Polar Early Career Scientists (APECS),2 in collaboration with the Past Global Changes Early-Career Network (PAGES ECN),3 the Permafrost Young Researchers Network (PYRN),4 Young Earth System Scientists (YESS),5 and the Mountain Research Initiative (MRI),6 conducted group reviews of the AR6 before and during the pandemic. This endeavor followed the group reviews of the first order draft (FOD) and second order draft (SOD) of the IPCC Special Report on the Ocean and Cryosphere in a Changing Climate (SROCC) conducted by ECRs and led by APECS prior to the pandemic. These group reviews had a significant impact on both the quality and quantity of review comments submitted to the IPCC, as well as the integration of ECRs into the IPCC process (Casado et al., 2020; Gremion et al., 2021). For the FOD, ECRs participating in the APECS-led group review contributed 17% of the total submitted comments, providing on average a slightly greater number of comments than the 388 other expert reviewers who provided feedback on the FOD [31.8 versus 26 comments; Casado et al. (2020)]. While the quantity of comments does not equal quality (van der Veer et al., 2014), Casado et al. (2020) were able to establish that, at least for the ECRs participating in the group review, there was no relationship between career stage (e.g., PhD students vs. postdoctoral researchers) and the number of editorial vs. substantive comments provided. This supports the assertion that the career stage does not significantly influence the quality of review comments.
Furthermore, surveys conducted on group review participants demonstrated that ECRs achieved diverse benefits from the review process, including improved familiarity with the IPCC, capacity building, and international networking (Gremion et al., 2021). Over 90% of group review participants were open to the possibility of participating in future group reviews of the IPCC report drafts, while over 85% were now open to contributing expert review comments as individuals (Casado et al., 2020). Thus, these initial ECR group reviews prior to the pandemic set the stage for further group reviews of the IPCC report drafts, which included reviews of the FOD and SOD for the WGI and WGII contributions to AR6. In this paper, we analyze data gathered during these six group reviews led by APECS and its partners since 2018 with the aim of assessing the group review process. Although the first two group reviews, FOD-SROCC and SOD-SROCC, have been analyzed by Casado et al. (2020) and Gremion et al. (2021), respectively, this study builds upon and notably expands the previous results in three ways. First, it generalizes the studies to the main assessment report, thus dealing with reviews of general environmental and climate science rather than only the ocean and cryosphere, which were the focus of the SROCC. Second, it analyzes a much larger sample size. Considering both FOD-SROCC and SOD-SROCC together, a total of 161 individuals participated as reviewers, and 92 survey responses were collected. Considering the last four group reviews together, a total of 446 individuals participated as reviewers, and 298 survey responses were collected. Thus, the analysis of the six group reviews allows us to draw more robust conclusions about these group reviews. Third, in contrast to the studies of Casado et al. (2020) and Gremion et al. (2021), which focus on the survey responses and the reviewers’ comments, this study also analyzes the characteristics of the applicants.
The main objective of this study is to evaluate this group review process of the IPCC assessment reports in terms of its contribution to the production of scientific knowledge, and as a career development opportunity for ECRs. More specifically, we aim to answer the following questions:
• Do these group reviews have the potential to counteract perceptual biases in the IPCC review process by engaging a diverse pool of ECRs?
• Were discussions among the reviewers, which facilitate active learning and the shared production of knowledge, encouraged during these group reviews?
• Does the career stage of ECRs determine the value of their contribution, measured as the number of substantive comments, to the review of the IPCC reports?
• How can the organization of these group reviews be improved to better align with the needs and expectations of ECRs?
• Should the IPCC formally include the review of its reports by ECRs as an essential component of its assessment cycle?
2 Materials and methods
2.1 Organization of the group review of the IPCC reports
APECS, in collaboration with PAGES ECN, PYRN, YESS, and MRI, organized six group reviews of the IPCC reports between 2018 and 2021 (Table 1). The organization of these reviews is challenging since the IPCC sets deadlines that must be met, with the IPCC review period usually lasting somewhat less than 2 months. Each group review was coordinated by a team composed of a Project Group Leader, Lead Chairs, and Chairs affiliated to one or more of the collaborating early career networks. The organization of the group reviews of the FOD-SROCC and SOD-SROCC is described in Casado et al. (2020) and Gremion et al. (2021), respectively. The organization of the last four group reviews is described here.
Table 1. Group reviews, name of the (co-)organizing associations, and number of members of the organizing teams.
The main stages were similar for all the group reviews (Figure 1). The IPCC released information on the timeline of the review cycle through a press release on its website. From this information, the organizing team prepared and released a call for applications to participate in the group review. The call for applications was addressed to ECRs, defined as individuals who fall into one of the following categories: master’s degree student, PhD student, postdoctoral researcher, or early career scholar. Once the application call was closed, the organizing team selected applicants based on research experience and motivation, balancing the selection of applicants based on gender, career stage, and country of residence or citizenship. The number of selected applicants for each review also depended on the number of team members since the latter are responsible for reviewing the comments of the former.
Figure 1. Main stages of the group reviews of the IPCC reports organized by APECS, PAGES ECN, PYRN, YESS, and MRI. The approximate time taken is shown for some of the stages.
The selected applicants (“participants”) were provided with training on how to review the IPCC reports. This training consisted of a “Guide for reviewers” and a webinar organized in collaboration with the IPCC staff. The participants were assigned different sections of the report for review. Some members of the organizing team also provided comments as reviewers. The organizing team then merged and reviewed all the comments for quality (e.g., removing duplicate or irrelevant comments) to produce the final comment sheet. This comment sheet was then submitted through the IPCC platform on behalf of the group. Finally, the participants were asked to fill in a survey to provide feedback on their experience in the group review process. The time between the release of the call for applications and the submission of the comments to the IPCC was approximately 6 months. For instance, the call for reviewers of SOD-WGII-AR6 was released on 23 July 2020 (APECS, 2020) and the comments were submitted to the IPCC at the end of January 2021. The exception was SOD-WGI-AR6 since the IPCC extended the review period by 6 weeks due to COVID-19 (IPCC, 2020).
In addition to acquiring review experience, reviewers have their efforts officially recognized in two ways. First, after the group review process, participants receive a certificate of participation signed by the representatives of each participating organization and by the co-chairs of the IPCC working groups. Second, the names of all reviewers are included in the annex of expert reviewers that is part of the final IPCC report.
2.2 Data collection and analysis
Data were collected at three different stages during each review process. First, prospective participants completed an online application form. Second, the review comments provided by the participants (i.e., the selected applicants) were compiled. Third, participants were asked to complete an online feedback survey following the group review.
Both the application form and the survey contained a combination of open-ended and multiple-choice questions. The responses to the open-ended qualitative questions were analyzed following thematic analysis (Braun and Clarke, 2006; Nowell et al., 2017). Thematic analysis is used to analyze free text in order to identify and generate themes, that is, patterns within it. This involves becoming familiar with the data set (iteratively reading the responses and understanding them), creating initial codes (attaching labels to different sections of the text), generating themes based on the respective codes (grouping different codes together), and applying this to the context of our research questions. Sentiment analysis (Pang and Lee, 2008; Bordoloi and Biswas, 2023), also known as opinion mining, was used to analyze the survey responses about the training. This analysis involves natural language processing and text analysis to identify and extract subjective information from the text, aiming to determine whether the sentiment expressed is positive, negative, or neutral. Detailed information about the design of the application and survey forms, as well as the categories that have emerged from these datasets, is presented below.
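As an illustration of this kind of sentiment scoring, the sketch below labels free-text training feedback as positive, negative, or neutral using the VADER lexicon from NLTK. The specific tool, thresholds, and example responses are assumptions made for illustration; the exact sentiment-analysis pipeline used in this study is not prescribed here.

```python
# Illustrative sketch only: VADER (NLTK) and the thresholds below are assumptions,
# not necessarily the exact tooling used for the survey analysis.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-time lexicon download

def classify_sentiment(text: str, analyzer: SentimentIntensityAnalyzer) -> str:
    """Label a free-text survey response as positive, negative, or neutral."""
    score = analyzer.polarity_scores(text)["compound"]  # compound score in [-1, 1]
    if score >= 0.05:
        return "positive"
    if score <= -0.05:
        return "negative"
    return "neutral"

analyzer = SentimentIntensityAnalyzer()
responses = [
    "The webinar was clear and very helpful.",          # hypothetical response
    "The training materials were confusing and rushed.",  # hypothetical response
]
print([classify_sentiment(r, analyzer) for r in responses])  # ['positive', 'negative']
```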
2.2.1 Applications
The application form was designed to gather demographic information as well as information about the applicant’s research expertise, review experience, motivation to participate in the group review, and expectations about the group review process. Analyses of the socio-demographics of the applicants have the potential to reveal whether a particular socio-demographic group or country is underrepresented, thus indicating the groups that the organizers of future group reviews should focus on reaching out to. The other information gathered is valuable to inform decisions on how to improve the organization of these group reviews and adapt them to the needs and expectations of ECRs. Not all application forms for all reviews contained the same questions, as the form was improved from review to review. Table 2 indicates the questions analyzed here and the respective reviews.
Table 2. Number of total applicants per group review of the different IPCC reports, and overview of the questions that were asked per application form (only the questions analyzed here are shown).
Before proceeding to the analysis of the application data, invalid applications were removed. This included duplicate applications and applications from Chairs, who automatically had the right to be reviewers. In addition, 10 (4%) and 20 (5%) individuals who applied for the SOD-WGI-AR6 and SOD-WGII-AR6, respectively, stated that they did not allow the organizers to use their answers anonymously for statistical purposes. Thus, their applications were removed from the data analysis.
The application data contained qualitative and quantitative responses, some of which were classified into different categories. The qualitative responses that were categorized pertain to the questions about career stage, discipline, participation in previous group reviews, motivation to participate in the IPCC group review, and the main challenges when reviewing scientific reports. The responses to the question about career stage were classified into the following categories: (1) undergraduate student, (2) master’s student, (3) PhD student, (4) postdoctoral researcher, (5) Early Career researcher/faculty (scientists and faculty who are ECRs), (6) faculty/research staff, (7) working in industry, and (8) other. Thirty disciplines were found across all group reviews. The Earth Sciences category encompassed disciplines such as Geophysics, Geosciences, Paleontology, Climatology, Remote Sensing, Oceanography, and Meteorology, among others. Note that many respondents listed more than one discipline. In those cases, if at least two of the listed disciplines belonged to Earth Sciences, the respondent was assigned to Earth Sciences. The question about participation in previous group reviews was asked slightly differently in the various application forms for the different group reviews. To be consistent, applicant answers were categorized as “yes” and “no,” depending on whether or not the applicant had participated in one or more previous group reviews of the IPCC reports organized by APECS and its collaborators (or only APECS). Furthermore, the motivation to participate in the IPCC group review was divided into the following categories: (1) learn more about the IPCC, (2) learn more about the review process, (3) improve their CV, (4) use their skills to contribute to science policy, (5) become familiarized with the current literature, (6) participate in international teamwork and networking, and (7) other. Regarding what the applicants considered the main challenge when reviewing a scientific report, such as the IPCC AR6, the answers were categorized as shown in Table 3. Since applicants’ responses to the questions about motivations and main challenges could fit into more than one category, the analysis of the responses to these two questions focuses on the percentage of responses that are classified in each category.
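A minimal sketch of the Earth Sciences assignment rule is shown below. The set of sub-disciplines is partial, and the fallback of keeping the first listed discipline when the rule does not apply is an assumption for illustration; the text does not specify how such cases were handled.

```python
# Illustrative sketch: assign "Earth Sciences" when a respondent lists at least
# two sub-disciplines under that umbrella (the set below is partial).
EARTH_SCIENCES = {
    "Geophysics", "Geosciences", "Paleontology", "Climatology",
    "Remote Sensing", "Oceanography", "Meteorology",
}

def categorize_disciplines(listed: list[str]) -> str:
    """Return a single discipline category for a multi-discipline response."""
    matches = [d for d in listed if d in EARTH_SCIENCES]
    if len(matches) >= 2:
        return "Earth Sciences"
    # Assumed fallback: keep the first listed discipline as the assigned category.
    return listed[0] if listed else "Unclassified"

print(categorize_disciplines(["Oceanography", "Climatology"]))  # Earth Sciences
print(categorize_disciplines(["Ecology"]))                      # Ecology
```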
Table 3. Name and definition of the categories for the responses of the applicants regarding what they think is the main challenge when reviewing a scientific report.
Regarding the quantitative responses, the question about the time the applicants expected to spend on the review was classified into different ranges of hours: (a) 0–16 h, (b) 17–40 h, (c) 41–80 h, (d) 81–160 h, and (e) more than 160 h. The time the applicants expected to spend on the review was expressed in hours in the responses of the SOD-WGII-AR6, but the responses were not always expressed in hours in the SOD-WGI-AR6. Therefore, to normalize the data for comparison, the answers were converted to hours, where one working day was considered equal to 8 h and, when responses were given in intervals of hours, the middle point of the interval was selected.
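A minimal sketch of this normalization is given below, assuming responses arrive as free-text strings. The conversion rules (8 h per working day, interval midpoints) follow the text, while the parsing patterns, function names, and example answers are illustrative assumptions.

```python
# Illustrative sketch of normalizing expected-time responses to hours:
# one working day = 8 h, and interval answers are replaced by their midpoint.
import re

HOURS_PER_DAY = 8

def to_hours(response: str) -> float:
    """Convert a free-text time estimate (e.g., '3 days', '2-5 days') to hours."""
    numbers = [float(x) for x in re.findall(r"\d+(?:\.\d+)?", response)]
    if not numbers:
        return float("nan")
    value = sum(numbers) / len(numbers)  # midpoint when an interval is given
    if "day" in response.lower():
        value *= HOURS_PER_DAY
    return value

def bin_hours(hours: float) -> str:
    """Assign the normalized estimate to the ranges used in the analysis."""
    if hours <= 16:
        return "0-16 h"
    if hours <= 40:
        return "17-40 h"
    if hours <= 80:
        return "41-80 h"
    if hours <= 160:
        return "81-160 h"
    return "more than 160 h"

for answer in ["40 hours", "2-5 days", "160"]:   # hypothetical answers
    h = to_hours(answer)
    print(f"{answer} -> {h} h -> {bin_hours(h)}")
```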
There are some limitations to the data that were gathered. One of the challenges when analyzing the responses to the application forms was that the forms differed across group reviews. Overall, the application form of each group review contained the questions included in the form of the previous review, sometimes improved, in addition to new questions. Since some questions were optional, some answers were missing (Table 2). In addition, some answers could not be classified (Table 2). Note that missing and unclassified responses were not included in the analysis.
2.2.2 Reviewers’ comments
In total, the six reviews received 14,532 comments, which corresponds to 2,422 ± 532 comments on average per review (Table 4). All comments, with the exception of 1,562 comments from the SOD-SROCC (n = 635), FOD-WGI-AR6 (n = 824), and FOD-WGII-AR6 (n = 103), were sorted by the organizers of the group review as substantive (comments indicating errors in the content, need for new or more relevant references, or suggesting new missing content), editorial (comments mainly suggesting copy-editing or corrections of how a reference was referred to), or duplicates/unfit (in rare cases, some comments were inappropriate and not communicated to the IPCC). The goal of the organizers was to sort comments in such a way that the content shared with the IPCC authors was of high quality. This also led to some comments not being sorted, though the relatively small number of non-submitted comments is not expected to affect the statistical tests presented here.
2.2.3 Survey responses
The survey was designed with the aim to obtain feedback from participants that would help improve the group review process. The questions asked in the surveys pertain to the following main themes: demographic information, motivation to join the group review, previous review experience, experience in this group review, feedback on the training provided for the review of the IPCC report, feedback on communication with the team members and with the other reviewers, and general feedback about the group review. As in the case of the application forms, each group review employed distinct survey forms. In general, each group review survey contained questions from previous review surveys, at times reworded to improve clarity, in addition to new questions. Table 5 indicates the questions analyzed per survey. When participants submitted multiple responses to a survey, only the first response was considered. The number of valid (i.e., not duplicate) responses to the survey ranged from 41 (SOD-SROCC) to 117 (SOD-WGII-AR6).
Table 5. Number of respondents per group review of the different IPCC reports, and overview of the questions analyzed here per survey.
The survey responses contained qualitative and quantitative responses, some of which were classified into different categories. Regarding the question about the time the reviewers had spent working on the whole review process, the total time range was divided into the following categories: (a) 4–6 h, (b) 6–12 h, (c) 12–24 h, and (d) more than 24 h. The same categories were used for the question about the maximum amount of time the respondents would have agreed to provide for the review, in addition to a new category for those who did not remember. The categories used to analyze these two questions are different from those used in the analysis of the application data since, in some of the surveys, the answers were limited to specific ranges of times.
Regarding the qualitative responses, responses to the questions asking if the training (overall training, webinar, and guide) was adequate were coded as 1 or 0 based on their content. For a given question, if a response was directly indicated as “yes” or rated between 3 and 5 on a linear scale, it was coded as “yes” (1), “no” (0), and “N/A” (0). If the response was directly indicated as “no” or was rated between 0 and 2 on a linear scale, it was coded as “yes” (0), “no” (1), and “N/A” (0). If there was no response to the question or the response was indicated as “not sure,” it was coded as “yes” (0), “no” (0), and “N/A” (1). The responses to the question about how to improve the training were classified into eight categories, and the comments on the webinar and the guide were classified into 11 categories. Additionally, responses with comments and suggestions on how to improve the training were further categorized into themes to identify “positive,” “negative,” and “neutral” sentiments. The responses to the question of whether communication with the organizers had been satisfactory were divided into three categories: (1) yes, (2) no, and (3) maybe/unknown. Since in the last two surveys the answer to the question was given on a scale from 1 to 5, the answers were categorized as follows: (a) no (1–2 on the scale), (b) unknown (3 on the scale), and (c) yes (4–5 on the scale). As for the question about whether the respondent had been encouraged to communicate with other participants, the responses were divided into three categories: (1) yes, (2) no, (3) maybe/unknown. The responses to the question about the most valuable aspect of participating in the group review were divided into seven categories: (1) communication skills, (2) contributing to an international project, (3) learning about the IPCC process, (4) networking, (5) professional growth, (6) reviewing training and/or experience, and (7) scientific update. Some responses were assigned to more than one category.
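A compact sketch of the yes/no/N-A coding described above is shown below, assuming each raw answer is either an explicit "yes"/"no", a numeric rating on a 0-5 scale, or missing; the function name and example answers are illustrative.

```python
# Illustrative sketch of the coding applied to training-adequacy answers:
# explicit "yes" or a rating of 3-5 -> yes; explicit "no" or a rating of 0-2 -> no;
# missing or "not sure" -> N/A. Each answer becomes three indicator variables.

def code_response(raw) -> dict:
    """Return indicator variables (yes, no, na) for one survey answer."""
    if raw is None or str(raw).strip().lower() in {"", "not sure"}:
        return {"yes": 0, "no": 0, "na": 1}
    text = str(raw).strip().lower()
    if text == "yes":
        return {"yes": 1, "no": 0, "na": 0}
    if text == "no":
        return {"yes": 0, "no": 1, "na": 0}
    rating = float(text)  # assumption: remaining answers are ratings on a 0-5 scale
    if rating >= 3:
        return {"yes": 1, "no": 0, "na": 0}
    return {"yes": 0, "no": 1, "na": 0}

for answer in ["yes", 4, 2, None, "not sure"]:   # hypothetical answers
    print(answer, "->", code_response(answer))
```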
There are two main limitations of this analysis. First, the forms differed across group reviews. The reason is that, initially, there was no intention of doing other group reviews. The decision to continue organizing more reviews was made when it was evident that ECRs appreciated this opportunity as much as IPCC authors valued the comments provided (personal communication). Moreover, the composition of the team and its leader changed over time. Thus, these group reviews have been an evolving process, with improvements being made over time. Second, not all participants responded to the survey and, among those who did, not all respondents provided answers to every question. Therefore, the results shown here do not necessarily reflect the experiences of all participants. Nevertheless, the responses to the survey provide useful information about whether the group review met the expectations of the respondents and offer constructive input on potential areas for improvement.
3 Results and discussion
3.1 Applications
The number of total applicants and selected applicants is shown in Figure 2. The number of applicants notably increased over time from the first (146) to the last (376) group review. However, this increase was not consistent, as can be seen with the decrease in applications for the FOD-WGII-AR6. In general, as the number of applicants increased, the number of selected applicants increased accordingly. This rise in the number of selected applicants relates to the number of Chairs growing as well, thus increasing the capacity of the organizing team to lead the work of a larger number of reviewers. Furthermore, the percentage of selected applicants per review gradually increased with time from the FOD-SROCC (51%) to the SOD-WGI-AR6 (60%), with the exception of the SOD-WGII-AR6 (55%).
Figure 2. Number of applicants (dark green), selected applicants (light green) and percentage of applicants selected (beige) for the different reviews.
Applicants mainly learned of the opportunity to participate in the four group reviews of the WGI-AR6 and WGII-AR6 through mailing lists, most frequently the APECS mailing list. Most applicants (between 73 and 89%, depending on the group review) had never participated in a group review of the IPCC report organized by APECS, or by APECS and its collaborators. The percentage of applicants who were affiliated with an ECR organization increased from the FOD-WGI-AR6 (72%) to the SOD-WGI-AR6 (83%), with the exception of the SOD-WGII-AR6, in which only 62% of applicants were affiliated with an ECR organization. The percentage of applicants with more than one affiliation with ECR organizations has remained relatively constant throughout the group reviews, ranging from 12 to 18%.
As the number of applicants increased, so did the number of countries of residence represented: from 25 countries represented in the FOD-SROCC to 54 in the SOD-WGII-AR6 (Figure 3). It is noticeable that, despite the decrease in the number of applicants for the FOD-WGII-AR6 compared to the previous group review, the number of countries of residence represented remained the same. It is important to highlight that not only did the number of countries increase, but Latin America, Africa, and Asia also gained representation over the course of the group reviews. Despite these gains, four countries remain overrepresented among all group reviews, namely the United States, Canada, the United Kingdom and Germany. In summary, the increase in the number of applications with time has been accompanied by an increase in diversity in terms of the number of countries of residence represented, though certain countries remain overrepresented.
Figure 3. Place of residence of those who applied to participate in all six group reviews (top figure) and in each group review (bottom figures). Orange colored countries indicate that residents of that country applied to participate in a review. Number labels indicate representation of residents of a particular country across all reviews. For example, a 6 means that residents of that country applied to participate in all six rounds of reviews. The boundaries shown are for illustrative purposes only and are not intended to convey a position on the legal status of disputed areas. Readers are advised to refer to authoritative and up-to-date sources for the latest information on geopolitical realities and territorial disputes.
Given that it is common in academia to move from one country to another, a question about the country of citizenship was included in the application forms of the last three group reviews. Similar to the country of residence, the number of countries of citizenship increased in line with the number of applications, from 41 for the FOD-WGII-AR6 to 62 for the SOD-WGII-AR6, indicative of increasing cultural diversity of applicants over time (Figure 4). The main countries of citizenship represented were India and the United States, followed by Canada and Germany. Regarding the number of applicants for both the country of residence and country of citizenship, there are few statistically significant differences among the group reviews over time.
Figure 4. Same as Figure 3, but for countries of citizenship. The boundaries shown are for illustrative purposes only and are not intended to convey a position on the legal status of disputed areas. Readers are advised to refer to authoritative and up-to-date sources for the latest information on geopolitical realities and territorial disputes.
It is important to note the percentage of applicants coming from low-income, lower-middle-income, upper-middle-income and high-income economies, as defined by the World Bank.7 The countries of residence of the applicants for the first two group reviews were overwhelmingly high-income economies (90 and 88% for the FOD-SROCC and SOD-SROCC, respectively). For the other reviews, the countries of residence of most applicants remained mainly high-income economies, but the percentage was smaller (between 60 and 78%). The representation of low-income economies was negligible (0–1%) for all reviews. The overrepresentation of high-income countries is smaller when considering the country of citizenship. For the last three reviews, the country of citizenship of between 50 and 62% of the applicants was a high-income economy. The second most-represented economies were lower-middle income (between 22 and 29%), followed by upper-middle income (between 15 and 20%). For each review, only 1% of applicants had a country of citizenship categorized as a low-income economy.
Only the application forms of the SOD-WGI-AR6 and SOD-WGII-AR6 included a question about gender. Considering both reviews together, virtually all applicants except for three persons selected either female or male in their answer. Only one person wrote non-binary as a response, and two persons answered that they preferred not to say. More men (54% of applications) applied for the SOD-WGI-AR6, and more women (54% of applications) applied for the SOD-WGII-AR6. However, these differences were not statistically significant (Chi-square = 5.01, p-value > 0.17).
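To illustrate the kind of test reported here, the sketch below runs a chi-square test of independence on a gender-by-review contingency table with scipy. The counts are invented for illustration and do not reproduce the study's data or the reported statistic.

```python
# Illustrative sketch of a chi-square test of independence between gender and
# group review; the counts below are hypothetical, not the study's data.
import numpy as np
from scipy.stats import chi2_contingency

# Rows: SOD-WGI-AR6, SOD-WGII-AR6; columns: women, men (hypothetical counts).
table = np.array([
    [120, 140],
    [160, 135],
])

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
# A p-value above the chosen significance level (e.g., 0.05) means the gender
# distribution does not differ significantly between the two reviews.
```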
The analysis of the career stage of the applicants found that, in all but one group review, the dominant category was PhD student, followed by postdoctoral researcher (Figure 5). The SOD-WGI-AR6 is the only group review with more postdoctoral researchers (39%) than PhD students (29%). The third most frequent career stage was Early Career researcher/faculty, except in the SOD-SROCC, where the number of applicants who were faculty/research staff was higher. Considering the last two group reviews together, the distribution of women and men per career stage was similar. The percentage of students (undergraduate and graduate) was higher for women (51%) than for men (45%), the percentage of postdoctoral researchers was similar for women (30%) and men (28%), and the percentage of Early Career researcher/faculty and faculty/research staff was higher for men (23%) than for women (15%).
Each group review included a variety of disciplines, with no discipline dominating in all reviews. Earth Sciences were the most represented among the applicants’ disciplines and are among the top four disciplines for all the reviews. Other highly represented disciplines include Atmospheric Sciences and Cryospheric Sciences (both in the top four in five group reviews), followed by Oceanography (in the top four in four group reviews). Geography, Biology, and Ecology were also represented in the top four, but with fewer occurrences. Regardless, a Chi-square test showed that the type of discipline is significantly dependent on the review completed (p-value <0.05). As such, though certain disciplines are well-represented across all reviews, each review attracted specific disciplines.
The main motivation to participate in both the SOD-WGI-AR6 (62% of responses) and the SOD-WGII-AR6 (74% of responses) was learning more about the review process (Figure 6). For the other motivation categories except for “other,” the percentage of responses ranged from 20 to 32% for both reviews. The other major motivations to participate in the group review of the SOD-WGI-AR6 were to use one’s skills to contribute to science policy (32%), learn more about the IPCC (31%), and participate in international teamwork and networking (30%). For the SOD-WGII-AR6, the other major motivations were to improve one’s CV (32%), use one’s skills to contribute to science policy (31%), and participate in international teamwork and networking (27%).
Figure 6. Percentage of applicants’ responses in the SOD-WGI-AR6 (dark green) and SOD-WGII-AR6 (light green) that were classified in each motivation category.
Regarding the previous experience of the applicants, the application forms gathered information about their review and publication experience. Most of the applicants (71–72%) for the first four group reviews already had experience reviewing scientific articles, proposals or reports. For the last two group reviews, the percentage of applicants with review experience was even higher (84% for the SOD-WGI-AR6 and 79% for the SOD-WGII-AR6). For the last two group reviews, the percentage of applicants with experience reviewing scientific articles, scientific reports and scientific proposals is shown in Figure 7. For both reviews, around 70% of the applicants had experience reviewing scientific articles, roughly half of them had experience reviewing scientific reports, and a little less than half of the applicants had experience reviewing scientific proposals. For both reviews, some applicants had experience reviewing more than four scientific reports (13%) and scientific proposals (11–12%), while 40 and 32% of the applicants for the SOD-WGI-AR6 and SOD-WGII-AR6, respectively, had reviewed more than four scientific articles. In summary, although most of the applicants for SOD-WGI-AR6 and SOD-WGII-AR6 already had experience reviewing scientific articles, for half of the applicants, the group review of the IPCC report was an opportunity to gain experience reviewing a scientific report for the first time.
Figure 7. Percentage of applicants for the SOD-WGI-AR6 (top row) and SOD-WGII-AR6 (bottom row) according to their experience reviewing scientific articles (left column), scientific reports (middle column) and scientific proposals (right column). The color corresponds to the number of articles/reports/proposals that the applicants have already reviewed in the past.
Analysis of the number of peer-reviewed articles published by applicants provides insight on their experience level. Most of the applicants for the SOD-WGI-AR6 and SOD-WGII-AR6 (75 and 74%, respectively) had published at least one peer-reviewed article as first author. For each review, roughly half of the applicants had published between one and four articles as first author, and only around a quarter of applicants had published more than four articles as first author. Most of the applicants also had experience as co-authors, with the majority of applicants for the SOD-WGI-AR6 and SOD-WGII-AR6 (76 and 72%, respectively) having published at least one peer-reviewed article as co-author. Less than half of the applicants had published between one and four articles as co-author (43 and 45%, respectively), and a minority had published more than four articles as co-authors (33 and 27%, respectively).
The time the applicants expected to spend on the review ranged from 45 min to 480 h. For SOD-WGI-AR6 and SOD-WGII-AR6, slightly more than half of the applicants indicated between 17 and 40 h (2–5 days), and around a quarter of the responses indicated 41–80 h (5–10 days). A very small percentage of applicants expected to spend 160 h or more. Interestingly, when compiling the average number of hours that the applicants expected to spend working on the two reviews, there was a statistically significant difference (p < 0.05) in the number of hours expected for participants working on SOD-WGI-AR6 (~52 h) compared to SOD-WGII-AR6 (~41 h). Note that the application call stated that “Previous reviewers spent on average 40 h on the process, including training and reviewing” (e.g., APECS, 2020), so it is likely that this information had an impact on the applicants’ response.
According to the applicants, the main challenge when reviewing a scientific report is ensuring that the vast number of references included in the report are adequately synthesized (Figure 8). Half and a third of the applicants for the SOD-WGI-AR6 and SOD-WGII-AR6, respectively, mentioned this challenge in their responses. The second main challenge is ensuring that the report is cohesive and easy to understand, with 29% of the responses for each group review mentioning this challenge. Other important challenges are avoiding leaning towards the viewpoint of stakeholders, and one’s ability to provide valuable feedback on the report. Finally, the time constraint is only mentioned by a relatively small number of applicants.
Figure 8. Percentage of applicants’ responses in the SOD-WGI-AR6 (dark green) and SOD-WGII-AR6 (light green) that were classified in each category regarding what they think is the main challenge when reviewing a scientific report.
3.2 Reviewers’ demographics and comments
The place of residence of reviewers in all six group reviews, and the country of citizenship for FOD-WGII-AR6, SOD-WGI-AR6, and SOD-WGII-AR6, are shown in Figures 9, 10, respectively. Note that only those participants who applied for, took part in, and submitted a comment file for the reviews are included. Chairs, who did not have to undergo the selection process, are not included in these figures since the objective is to analyze whether the selection of applicants by the team led to a diverse group of reviewers. Across all reviews, 55 countries are represented as a place of residence, of which the majority are high-income countries (53%), with the remainder split equally between upper-middle and lower-middle income countries. No low-income countries are represented among participants who submitted comment files. While the number of countries of residence represented increased from the first to the last group review, except for FOD-WGII-AR6 (mirroring the pattern observed for applications), it showed relatively little variation, ranging from 21 for SOD-SROCC to 34 for SOD-WGII-AR6. Except for SOD-SROCC, the number of countries represented through residence centers around 30. This number is lower than for the applications received and suggests that many potential reviewers do not complete the process for various reasons. In comparison, 77 countries of citizenship are represented across FOD-WGII-AR6, SOD-WGI-AR6, and SOD-WGII-AR6, ranging from 40 for FOD-WGII-AR6, to 48 for SOD-WGI-AR6, and 62 for SOD-WGII-AR6. Again, high-income countries are best represented at 42%, followed by lower-middle (28%) and upper-middle (22%) income countries. Low-income countries are also represented, at 8%. This indicates that while higher-income countries are best represented, low- and lower-middle-income countries do participate. However, these participants are more likely to reside and work outside of their country of birth. It is worth noting that APECS membership has steadily increased (Hindshaw et al., 2018), and an increase in participation from the developing world could thus be expected. However, the reviews show that this is not the case. Thus, to increase developing-world participation, APECS should actively aim to increase its developing-world membership. Another aspect worth noting is the availability of funding for those from the developing world. The success of ECRs often hinges on adequate funding (Moraru et al., 2024). Participation in these reviews would be improved by providing funding to those in need of it, such as participants from the developing world.
Figure 9. Place of residence of those who applied for, took part in, and submitted a comment file in all six group reviews (top figure) and in each group review (bottom figures). Colored countries indicate that residents of that country participated in a review. In the top figure, the colors represent the type of country according to the classification by the World Bank. Number labels indicate representation of residents of a particular country across all reviews. For example, a 6 means that residents of that country participated in all six rounds of reviews. The boundaries shown are for illustrative purposes only and are not intended to convey a position on the legal status of disputed areas. Readers are advised to refer to authoritative and up-to-date sources for the latest information on geopolitical realities and territorial disputes.
Figure 10. Same as Figure 9, but for countries of citizenship. The boundaries shown are for illustrative purposes only and are not intended to convey a position on the legal status of disputed areas. Readers are advised to refer to authoritative and up-to-date sources for the latest information on geopolitical realities and territorial disputes.
In what follows, we consider all comments submitted by all reviewers (including Chairs). Of all sorted comments, 36% were substantive, 51% were editorial, and 13% were duplicates/unfit on average (Table 6; Figure 11). There is a significant increase (p < 0.05) of duplicate/unfit comments from the FOD to the SOD reviews (except for the SROCC). This is neither associated with a corresponding increase in reviewers nor with an increase in the relative representation of a specific career stage (Figure 12). Considering that the SOD was the product of the FOD review, it can be assumed that additional substantive comments would be more difficult to make.
Table 6. Percentage of substantive, editorial, and duplicate/unfit comments of all sorted comments per review.
Figure 11. Percentage of substantive, editorial, and duplicate/unfit comments of all sorted comments per review.
Figure 12. Number of reviewers per career stage for each review (left) together with the total number of comments received (black line with diamonds) and the number of comments submitted to the IPCC (black line with squares) (right).
A Kruskal–Wallis test (a non-parametric analog of a one-way ANOVA) at p < 0.05 was used to test for differences in the number of reviewers, gender, comments per gender, and the ratio of the different types of comments per review. The number of reviewers increased steadily per review, from 61 reviewers for the FOD-SROCC to 186 reviewers for the SOD-WGII-AR6 (Figure 12). There is no significant difference in the relative representation of the different career stages (MSc, PhD, postdoctoral researcher, faculty, other/unknown) per review, indicating that the increase in reviewers was evenly distributed across career stages. Also, there is no correlation between the total number of reviewers and the total number of comments received (Figure 12). We observe no significant difference in the ratio of substantive, editorial, and duplicate/unfit comments between females and males. We also evaluated the ratio of substantive, editorial, and duplicate/unfit comments for the different career stages (Figure 13). No significant difference was found between PhD students, postdoctoral researchers, and faculty. A significant difference was, however, found in the ratio of editorial and substantive comments of master’s students compared to other career stages. Master’s students provided significantly fewer substantive and significantly more editorial comments. These results extend the earlier findings of Casado et al. (2020) and Gremion et al. (2021), for which master’s students could not be analyzed independently because the group was too small. Interestingly, PhD students’ performance as reviewers clearly matches that of postdoctoral researchers and faculty reviewers. Thus, although researchers in earlier career stages provided more editorial comments, their inclusion in the group reviews still led to a considerable number of substantive comments, improving the breadth and strength of the IPCC review process in all cases. Further, any increase in the IPCC engagement of researchers who may contribute to future ARs is valuable in and of itself.
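As a sketch of how such a comparison could be run, the example below applies scipy's Kruskal–Wallis H test to per-reviewer ratios of substantive comments grouped by career stage. The ratios are invented for illustration and are not the study's data.

```python
# Illustrative sketch of a Kruskal-Wallis H test (rank-based analog of a one-way
# ANOVA) comparing the per-reviewer ratio of substantive comments across career
# stages; the ratios below are hypothetical.
from scipy.stats import kruskal

msc     = [0.15, 0.20, 0.10, 0.25, 0.18]   # master's students
phd     = [0.35, 0.40, 0.30, 0.45, 0.38]   # PhD students
postdoc = [0.36, 0.42, 0.33, 0.39, 0.41]   # postdoctoral researchers
faculty = [0.34, 0.37, 0.44, 0.31, 0.40]   # faculty / research staff

h_stat, p_value = kruskal(msc, phd, postdoc, faculty)
print(f"H = {h_stat:.2f}, p = {p_value:.4f}")
# p < 0.05 would indicate that at least one career stage differs in its ratio of
# substantive comments; pairwise follow-up tests would locate the difference.
```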
Figure 13. Percentage of substantive, editorial, and duplicate/unfit comments per career stage for all six reviews combined. Percentages marked with an asterisk are significantly different from the rest.
3.3 Survey responses
3.3.1 Motivation
Most respondents in all review rounds had more than one motivation, with 95.1% reporting at least two different motivations. Furthermore, the number of motivating factors appeared to increase along with the increase in the number of options listed in the survey. For example, FOD-SROCC had the highest percentage of responses with one or two motivating reasons, while SOD-WGI-AR6 and SOD-WGII-AR6 had a large majority of respondents reporting five to eight motivations. This may imply that participants were more informed about the IPCC review process, but it may also show that by increasing the number of options in the survey, candidates were more likely to select multiple options, allowing respondents to better define their motivation. More than half of the respondents (51 to 64%) expressed interest in obtaining a certificate of participation. Note that this question was only asked in the first four group reviews. The most common answers supporting a request for a certificate of participation related to career benefits (e.g., inclusion in one’s CV) or to academic or administrative validation. This shows that participants believe that this experience would have a tangible positive impact on their career. Several respondents indicated that they viewed this experience as a reflection of their values and as a way to validate themselves.
3.3.2 Experience in the review process
Considering all the group reviews together, most respondents spent more than 12 h working on the whole review process, with only 18% spending 4–6 h (Figure 14). Regarding the maximum time respondents would have agreed to spend on the review, the largest share (31%) indicated 12–24 h and 29% indicated over 24 h, whereas around a quarter (23%) would have agreed to spend 4–6 h and 15% would have agreed to spend 6–12 h (Figure 14). Comparing the time reviewers would have agreed to spend with the time they actually spent shows that the majority actually spent more time than initially expected: 31% reported spending over 24 h and 28% reported spending 12–24 h on the review. Since the application call stated that reviewers spent on average 40 h on the process (e.g., APECS, 2020), these results indicate that not everyone was willing or able to dedicate that amount of time to the process.
Figure 14. Distribution of the percentage of respondents as a function of the number of hours they spent working on the whole review process (dark green) and of the maximum number of hours that they would have agreed to spend on the review process (light green).
For all group reviews, the majority of respondents indicated that there had been factors limiting the number of hours they were able to spend on the review process (Table 7). Workload was the predominant limiting factor for all reviews except SOD-WGI-AR6, and personal factors were the second limiting factor in all group reviews. Since the COVID-19 pandemic coincided with the SOD-WGI-AR6 review, the participants of that review were asked whether the pandemic had affected the number of hours they had been able to dedicate to the review process; half of respondents indicated that it had. During that group review, the deadline to submit comments was moved from April 9 to May 15. Of a total of 60 respondents, 11.6% indicated that they could not deliver on time even with the deadline change, 38% could have sent their review on time without the deadline extension, and 50% indicated that they would have submitted fewer comments had the extension not been provided.
Table 7. Percentage of responses to the question about whether there were any factors limiting the number of hours the respondents were able to spend on the review process.
3.3.3 Training
In response to the question of whether participants found the amount of training provided to be adequate, 98.5% of respondents (391 individuals) indicated that the training met their needs, and only 1.3% (five individuals) did not find the training adequate (the remaining 0.2%, one individual, was unsure about the adequacy of the training). We analyzed the 103 recommendations provided in response to the question about suggested training improvements. Common themes emerged, including a desire for more in-depth training sessions, clearer training materials, and additional opportunities for interactive learning. These suggestions were further sub-categorized, resulting in a total of 94 recommendations (excluding similar responses among the 103 received) that contributed to the formation of eight distinct themes: Training and Communication, Review Process and Experience, Expertise Alignment and Section Assignment, Collaboration and Networking, Time and Workload, Section Review, Collaboration and Interaction, and Miscellaneous. The themes also provided valuable insight into the overall feedback on training, revealing 74 positive responses, 13 negative responses, and 16 neutral responses regarding areas for improvement. Most of the respondents affirmed that the webinar (84.4%) and the guide (94.4%) provided them with sufficient information on the IPCC assessment process to undertake their review. Only seven respondents expressed dissatisfaction with the content of the webinar, and six respondents expressed dissatisfaction with the guide’s content and structure. Regarding recommendations to improve the webinar and the guide, respondents suggested more interactive and engaging training materials, additional visual aids, and practical, real-world case studies to illustrate concepts. Suggestions to improve the webinar included more detailed content, enhanced clarity, and additional resources for reference. As for the guide, respondents recommended more illustrative examples, improved clarity in specific sections, and case studies. In general, 71.7% of the comments regarding training resources were positive, focusing on review acknowledgment, enhancing interactions among reviewers, and optimizing chapter assignments for expert reviews. A further 17% of respondents shared feedback about challenges encountered during the review process, such as the timing of training sessions, reviewer assignments, fairness, and review difficulties like expertise mismatches and writing quality. Finally, 11.3% of respondents provided neutral feedback highlighting areas for improvement, such as reviewer–section alignment, clarity, transparency, and scheduling webinars closer to the review period.
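As an illustration of this kind of tallying, the sketch below shows how manually coded free-text recommendations could be aggregated by theme and sentiment in Python; the coded entries are hypothetical placeholders, not the survey responses analyzed here.

```python
# Minimal sketch of aggregating manually coded survey recommendations by theme and
# sentiment; the coded entries below are hypothetical placeholders, not the study data.
from collections import Counter

coded_responses = [
    {"theme": "Training and Communication", "sentiment": "positive"},
    {"theme": "Time and Workload", "sentiment": "negative"},
    {"theme": "Expertise Alignment and Section Assignment", "sentiment": "neutral"},
    {"theme": "Training and Communication", "sentiment": "positive"},
]

theme_counts = Counter(r["theme"] for r in coded_responses)          # responses per theme
sentiment_counts = Counter(r["sentiment"] for r in coded_responses)  # positive/negative/neutral

print(theme_counts.most_common())
print(sentiment_counts)
```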
3.3.4 Feedback: communication
In the last five group reviews, a large proportion (85–97%) of the respondents considered communication with the organizing team to be satisfactory, with only the SOD-SROCC showing a percentage below 95%. Overall, less than 3% of respondents perceived the communication as unsatisfactory, except in the SOD-SROCC, where 9.8% of the respondents considered it unsatisfactory. During all group reviews except the first one, the Slack8 team communication tool was used to facilitate communication between participants. Only the last two group reviews included a question about the usefulness of Slack for communicating with other participants, with a scale response from one (not useful) to five (very useful). Overall, the respondents of both group reviews agreed on the usefulness of Slack for communicating. However, agreement (ratings of 3–5) decreased from 80% for SOD-WGI-AR6 to 75.2% for SOD-WGII-AR6, with a corresponding increase in disagreement (ratings of 1–2) from 20% to 24.8%.
In FOD-WGI-AR6 and FOD-WGII-AR6, the majority of respondents (65.8 and 71.1%, respectively) felt that they had been encouraged to communicate with other participants. However, a relatively large proportion of respondents (15.8 and 17.8%) indicated the opposite, that they had not been encouraged to communicate with other participants, while a similar percentage of respondents (18.4 and 11.1%) were unsure. In SOD-WGI-AR6 and SOD-WGII-AR6, this survey question was modified to analyze the actual exchanges with other reviewers. In both reviews, approximately 60% of the respondents said that they had not had any exchange with other reviewers during the review process. Although different questions were asked in different reviews, these results may indicate that, even though participants felt encouraged to communicate with each other, most of them ultimately did not. Thus, it would be advisable to develop methods to facilitate communication between participants during future group reviews.
3.3.5 General feedback
The general organization of the last five group reviews was considered satisfactory by 95% of respondents. For SOD-WGI-AR6 and SOD-WGII-AR6, the majority of respondents indicated that reviewing as a group had met their expectations (86%), whilst only 2% were not satisfied and 12% were not sure. Overall, learning about the IPCC process was considered the most valuable aspect of participating in this group review (27% of responses), followed by scientific updates (20%) and contributing to an international project (18%). Respondents also valued the opportunity for professional growth (15%) and acquiring reviewing training and/or experience (13%).
The respondents provided valuable feedback on their overall experience, expressing a desire for certain improvements and highlighting positive aspects of the review process. One common request was for more time and extended discussions with fellow reviewers, allowing for a deeper exploration of the topics at hand. They also emphasized the importance of effective communication channels, especially in light of outside responsibilities and limited availability due to personal circumstances. The respondents recognized the significance of involving scientists from developing countries, emphasizing the need for their perspectives and expertise in the review process. The well-organized nature of the review process received praise from many reviewers and IPCC authors (personal communications). They commended the clear structure and coordination efforts that contributed to a smooth and efficient experience. However, a few reviewers encountered challenges with section assignments and email communication, which resulted in confusion and last-minute attempts at completing tasks. Suggestions were made to address these issues by implementing redundant methods of contact and providing clearer instructions regarding section assignments. Despite these challenges, the overall sentiment was positive, with reviewers expressing their appreciation for the opportunity to contribute to the review process. They highlighted the value of participating as reviewers and expressed a willingness to continue engaging in future reviews. Collaborative work and interactions among reviewers were seen as beneficial, with several reviewers suggesting the promotion of discussions and networking opportunities. Given these positive comments, it is not surprising that the majority of respondents (88%) from the six review processes affirmed that they were keen to participate in another group review, while only 2% were not and 10% were unsure.
The IPCC review process provides opportunities to participate not only in group reviews but also in individual reviews. The percentage of respondents who affirmed that they would be interested in individually reviewing a chapter of an IPCC report decreased from FOD-SROCC to FOD-WGI-AR6, and then increased from FOD-WGII-AR6 to SOD-WGII-AR6. Likewise, the percentage of survey respondents who affirmed that they had individually reviewed other sections of the report decreased from FOD-WGI-AR6 to SOD-WGII-AR6 (note that the responses to the FOD-WGII-AR6 survey were not analyzed since only two individuals answered this question). The majority of them chose to review an additional portion or an entire chapter out of personal interest, because of its relevance to their work, or to improve their overall understanding. Regardless, the number of substantive comments made indicates the usefulness of including ECRs in such reviews. As such, reviews of this kind should be included in upcoming IPCC reviews or even made part of the official review process. Not only are substantive comments received, but this also offers the opportunity to train ECRs in review processes, leading to skills transfer.
4 Discussion
As Solomon et al. (2008) state, “IPCC assesses the literature; it does not conduct new research.” The IPCC reports are shaped by particular perspectives, albeit partially counterbalanced by the thorough expert review process (Ford et al., 2016). Because this review process is open, it reduces common social biases in reviewing, such as those listed by Lee et al. (2013): prestige bias, nationality bias, and gender bias. The inclusion of ECRs in these group reviews has encouraged their participation, which has been steadily increasing over time, as demonstrated by the growing number of applicants since the first review in 2018. Most applicants were PhD students, followed by postdoctoral researchers. The diversity of applicants in terms of country of residence and citizenship has also increased since the 2018 review. The countries of residence of applicants were mostly high-income economies, as defined by the World Bank, and the representation of low-income economies was negligible. For countries of citizenship, the overrepresentation of high-income countries persisted, but was smaller. Although the imbalance may stem from bias within the organizers’ membership, it could also be attributed to the low priority given to research in many low- and middle-income economies (Sharma et al., 2023). This situation results in insufficient research funding, a lack of research culture, limited awareness about research, and a shortage of research personnel, as stated by Sharma et al. (2023), thus contributing to the observed disparity. There was no statistically significant difference between the percentages of female and male applicants. However, we observed a consistent decline in the proportion of women at higher career stages (Pico et al., 2020), which may reflect the loss of women along the academic career path, consistent with challenges reported in the existing literature (Williams and Ceci, 2012; O’Connell and McKinnon, 2021). Overall, the applicants belonged to a diverse array of disciplines, with no particular discipline dominating all reviews; instead, each review attracted individuals from specific disciplines. However, most of those disciplines are in the natural sciences. This may seem consistent with the choice of reports reviewed, but the limited inclusion of social science perspectives in previous IPCC reports has already been pointed out as an area needing improvement (Vasileiadou et al., 2011; Ford et al., 2016). While many applicants possessed prior reviewing experience, their interest in participating in these group reviews stemmed from a desire to gain insights into the review process. Despite the crucial role reviewers play, most scholars often find themselves learning as they proceed, drawing from the reviews they receive firsthand (Min, 2005; Kerig, 2021).
The vast majority of survey respondents expressed satisfaction with the general organization of the group reviews, the amount of training provided, and communication. Almost a third of respondents stated that the most valuable aspect of participating in this group review was learning about the IPCC report and review process, possibly because they are highly motivated to become actively involved in the IPCC process (van der Veer et al., 2014; Gulizia et al., 2020). As potential areas of improvement, respondents indicated the need for more time and the opportunity to have discussions with other reviewers. Although participants were encouraged to communicate with each other, most of the survey respondents indicated that they did not. Hence, it appears that a barrier exists, hindering the capacity of participants to engage in discussions with one another and with the organizing team. This obstacle may have diminished the considerable value of peer discussion as an efficient method of active learning (Smith et al., 2009), which could have improved the quality of the comments submitted.
Given their positive experience in this group review process, most respondents expressed their willingness to participate in future group reviews. Regarding the effort required from the participants, most survey respondents indicated that they had dedicated more than 12 h to the whole review process, and the majority indicated that the maximum time they would have agreed to spend was over 12 h. However, a non-negligible number of respondents stated that the maximum time they would have agreed to spend on the review process was less than 12 h. This latter finding aligns with the 4–5 h that reviewers typically spend reviewing a paper (Kovanis et al., 2016). The primary factor constraining the time respondents could dedicate to the review was their existing professional commitments. Indeed, as with most reviews, these reviews were conducted on a voluntary basis, and reviewers did not receive any compensation for their efforts, except for the certificate of participation that more than half of survey respondents were interested in obtaining. The reviewers were students or academics who carved out time from their regular duties, which limited their capacity to fulfill their reviewing responsibilities (Kovanis et al., 2016; Aczel et al., 2021). In addition, in the specific case of SOD-WGI-AR6, another limiting factor was the COVID-19 pandemic.
These group reviews have not only proved valuable for the participants; our findings suggest that these efforts have also benefitted the IPCC review process. The IPCC operates under the fundamental principle that assessment reports, drawing from thousands of research publications, should adhere to the standards of being “comprehensive, objective, open, and transparent” (IPCC, 2013). Essential to fulfilling these objectives is the quality assurance offered by external review (De Vries et al., 2009; Castree et al., 2021). In our study, an average of 2,422 ± 532 comments were produced per group review, of which 36% were substantive comments. Our reviews were therefore effective: they were comprehensive in depth (Palutikof et al., 2023) and were performed by experts of the global science community who are underrepresented in reviewing. Indeed, when considering the types of comments provided (substantive, editorial, duplicate/unfit), PhD students show the same performance as postdoctoral researchers and faculty reviewers. The performance of master’s students is significantly lower than that of other career levels, potentially due to less experience with the creation, review, and dissemination of scientific research. The fact that the IPCC has been providing support to these group reviews over several years suggests that these efforts are considered valuable for the review process of IPCC ARs.
The results of this study validate preliminary findings about the quality of ECRs as reviewers (Casado et al., 2020; Gremion et al., 2021), using a larger sample and a broader array of topics. From these studies, several suggestions can be made regarding the recruitment of potential reviewers and the review process itself.
Regarding the application process, efforts should be made to attract a more diverse group of potential reviewers. Future reviews should strive for greater participation from the developing world and identify organizations prevalent in developing countries to improve the reach of the review call. One option is to tailor the call to the different regions (developing world, developed world) and to pay greater attention to the balance of reviewers from these regions. APECS and its co-organizing bodies should approach the IPCC for more formal arrangements for future reviews. This would allow the IPCC’s reach to be used in distributing the call among countries, hopefully increasing the participation of ECRs from the developing world. Another approach would be to contact relevant programs or institutions in developing countries to assist in distributing the call among their research communities. Furthermore, ECRs from the developing world who are active within the different associations organizing these reviews (APECS, YESS, and others) could be approached to steer participation from within their countries. Finally, funding can be made available to ensure access to the review. This could include leave funding (to allow ECRs to complete the review by taking leave from their current employer) or (Internet) data funding (to ensure ECRs have access to the review items). Providing equitable access to funding for ECRs is essential to ensure that they have the opportunity to contribute to scientific research (Moraru et al., 2024). Regarding gender diversity, it is encouraging that the percentages of men and women applying to participate in the last two reviews (the only ones for which we have gender data) were similar. Enlarging the choices of gender in the application form (which only included “male,” “female,” and “other”) is recommended to increase the number of applications from people who identify as a gender other than male or female. In addition, the application form should be modified to inform potential participants of a realistic range of the number of hours that reviewers usually dedicate to the process, so as to avoid deterring potential reviewers from submitting their applications.
Regarding the group review process, the following suggestions are made for each aspect of the process:
• Training: As underlined by Smith (2006), training ECRs to review may contribute to continuously improving review quality. More comprehensive training webinars and materials should be provided to the reviewers. In particular, the training should include more detailed content, additional resources, and real-world examples to illustrate concepts. It is highly recommended to include hands-on learning so that participants have the opportunity to review some text and receive feedback. Particular attention should be paid to providing better training on how to make substantive comments. Given the time constraints of these group reviews, the training sessions should be organized well before the deadline for the submission of comments, since this would allow for more in-depth training. For instance, twice-yearly sessions could be offered to all ECRs interested in reviewing the IPCC reports. Given that many applicants stated that their primary motivation to participate in the group reviews was learning more about the review process, these training sessions would likely attract many ECRs. Nevertheless, the opportunity to participate in webinars that take place closer in time to the review period should remain, to ensure that ECRs acquire or consolidate the knowledge and skills required to provide a high-quality review. Finally, webinars should be scheduled across more time zones to allow greater participation.
• Communication: A strategy should be developed to facilitate engaging discussions between reviewers, which would likely lead to more substantive and fewer duplicate comments. In addition, more effective communication channels between the organizing team and the reviewers should be implemented. Future reviews should thus identify free communication platforms, such as Slack, which was used for the previous reviews, and provide training on these platforms to the review participants to improve their effective use. Such platforms can also be integrated into productivity software, such as Notion9, to improve the workflow and communication. It is important that these platforms be free to use, to prevent participants from developing countries from being excluded based on the cost of the tools used.
• Review: The number of pages assigned to reviewers should be reduced so that the amount of time dedicated to the review process is more realistic given the high workload of ECRs and the limited time available for the whole review process. However, reviewers who are interested in reviewing more pages should be given the opportunity to do so. The instructions addressed to reviewers regarding section assignments should be clearer. In addition, the organizing team should try to assign participants sections of the report that better correspond to their interests, while respecting the constraint that two people should be assigned to review the same section.
A common constraint to the implementation of many of these changes is the short review period. Indeed, the time the IPCC provides for the review of its reports is the same regardless of whether the review is individual or conducted by a group. Since a group review requires additional steps involving the sharing of tasks and discussion among reviewers, we argue that the time allocated to the group review of the IPCC report should be increased. Another limitation of the IPCC group review process is that the comment file is sent to the IPCC on behalf of the group; it therefore does not include the names of the individuals who made each comment, so IPCC authors cannot attribute comments to specific reviewers. We suggest that the IPCC modify its group review process so that the names of the reviewers who made each comment are shown.
Based on the number of substantive comments received from the reviews, APECS and the other ECR organizations should approach the IPCC about formalizing the inclusion of ECRs in future reviews. Furthermore, based on the experience of the previous six group reviews, the organizers of these reviews should create standard operating procedures and best-practice documents covering the various aspects of the review process. Formalizing ECR involvement in future reviews would also give the ECR group review call greater reach, potentially attracting more participants from the developing world.
Finally, it is important to highlight that the inclusion of ECRs in reviews of the IPCC reports can help reduce the bias inherent in the review process. Indeed, the participation of ECRs, who likely hold fewer perceptual biases than senior researchers, in the review of the IPCC reports has the potential to increase the breadth of knowledge included, so that perspectives outside mainstream science are considered. The inclusion of different perspectives is further encouraged by the inclusion of ECRs from different socio-cultural and economic backgrounds. In this sense, a future study could analyze the knowledge production process of group reviews by ECRs using a framework such as the Serendipity-Mindsponge-3D (SM3D) knowledge management framework (Vuong et al., 2022).
To summarize, ECRs benefit from participating in group reviews of the IPCC reports and, at the same time, they make a substantial contribution to the IPCC review process. Given the success of these group reviews, this effort should be continued for the upcoming IPCC Seventh Assessment Report.
Data availability statement
The original contributions presented in the study are included in the article/Supplementary material; further inquiries can be directed to the corresponding author.
Ethics statement
Ethical approval was not required for the studies involving humans because the objective of this project was to evaluate and improve the group review process organized by APECS, PAGES ECN, PYRN, YESS, and MRI. Therefore, it can be defined as a program evaluation and improvement project, which does not require institutional review board oversight. The studies were conducted in accordance with the local legislation and institutional requirements. The participants (i.e., selected applicants) of the IPCC group reviews provided their written informed consent to participate in this study. The applicants of the last two group reviews provided their informed consent to participate in this study, but for the first four group reviews, consent to participate in this study was not requested. Regardless, all data were depersonalized for all group reviews, and no individual can be identified from the analyzed data.
Author contributions
MM-I: Conceptualization, Formal analysis, Funding acquisition, Methodology, Project administration, Supervision, Visualization, Writing – original draft, Writing – review & editing. MC: Conceptualization, Formal analysis, Methodology, Supervision, Visualization, Writing – original draft, Writing – review & editing. GGr: Conceptualization, Formal analysis, Methodology, Supervision, Visualization, Writing – original draft, Writing – review & editing. VR: Conceptualization, Formal analysis, Methodology, Supervision, Visualization, Writing – original draft, Writing – review & editing. OA: Formal analysis, Methodology, Visualization, Writing – original draft, Writing – review & editing. CA: Formal analysis, Methodology, Visualization, Writing – original draft, Writing – review & editing. AA: Formal analysis, Methodology, Visualization, Writing – original draft, Writing – review & editing. FB: Formal analysis, Methodology, Visualization, Writing – original draft, Writing – review & editing. CB: Formal analysis, Methodology, Visualization, Writing – original draft, Writing – review & editing. HB: Formal analysis, Methodology, Visualization, Writing – original draft, Writing – review & editing. JC: Formal analysis, Methodology, Visualization, Writing – original draft, Writing – review & editing. NC: Formal analysis, Methodology, Visualization, Writing – original draft, Writing – review & editing. EC: Formal analysis, Methodology, Visualization, Writing – original draft, Writing – review & editing. MR: Formal analysis, Methodology, Visualization, Writing – original draft, Writing – review & editing. HD: Formal analysis, Methodology, Visualization, Writing – original draft, Writing – review & editing. RD: Formal analysis, Methodology, Visualization, Writing – original draft, Writing – review & editing. GGa: Formal analysis, Methodology, Visualization, Writing – original draft, Writing – review & editing. HG: Formal analysis, Methodology, Visualization, Writing – original draft, Writing – review & editing. SH: Formal analysis, Methodology, Visualization, Writing – original draft, Writing – review & editing. CH: Formal analysis, Methodology, Visualization, Writing – original draft, Writing – review & editing. VH: Formal analysis, Methodology, Visualization, Writing – original draft, Writing – review & editing. JH: Formal analysis, Methodology, Visualization, Writing – original draft, Writing – review & editing. TJ: Formal analysis, Methodology, Visualization, Writing – original draft, Writing – review & editing. SJai: Formal analysis, Methodology, Visualization, Writing – original draft, Writing – review & editing. SJaw: Formal analysis, Methodology, Visualization, Writing – original draft, Writing – review & editing. ML: Formal analysis, Methodology, Visualization, Writing – original draft, Writing – review & editing. JM: Formal analysis, Methodology, Visualization, Writing – original draft, Writing – review & editing. JF: Formal analysis, Methodology, Visualization, Writing – original draft, Writing – review & editing. RO'H: Formal analysis, Methodology, Visualization, Writing – original draft, Writing – review & editing. OO: Formal analysis, Methodology, Visualization, Writing – original draft, Writing – review & editing. RP: Formal analysis, Methodology, Visualization, Writing – original draft, Writing – review & editing. EA: Formal analysis, Methodology, Visualization, Writing – original draft, Writing – review & editing.
SR-V: Formal analysis, Methodology, Visualization, Writing – original draft, Writing – review & editing. PR: Formal analysis, Methodology, Visualization, Writing – original draft, Writing – review & editing. SR-P: Formal analysis, Methodology, Visualization, Writing – original draft, Writing – review & editing. VS: Formal analysis, Methodology, Visualization, Writing – original draft, Writing – review & editing. MS: Formal analysis, Methodology, Visualization, Writing – original draft, Writing – review & editing. DV: Formal analysis, Methodology, Visualization, Writing – original draft, Writing – review & editing.
Funding
The author(s) declare that financial support was received for the research, authorship, and/or publication of this article. Publication of this article was funded by the University of Colorado Boulder Libraries through an institutional agreement with Frontiers. CB was supported by Universidad Científica del Sur (Resolución Directoral No. 008-DGIDI-CIENTIFICA-2024).
Acknowledgments
We acknowledge the support of APECS, PAGES ECN, PYRN, YESS, and MRI in the organization of these group reviews of the IPCC reports. We would particularly like to thank the APECS International Directorate and APECS leadership for supporting the “IPCC group review” Project Group throughout all these years. In addition to the authors of this manuscript, other individuals were part of the organizing team of these reviews, and we are grateful for their contribution to this project: Manuel Chevalier (University of Bonn), Isabel Prater (University of Cologne), Priyanka Vadrevu (The Energy and Resources Institute), Gaby S. Langendijk (Helmholtz-Zentrum Hereon), Feba Francis (University of Hyderabad), Jo-Ting Huang-Lachmann (Helmholtz-Zentrum Hereon), Scarlett Zetter (The Arctic University Museum of Norway), Elisa Seyboth (University of Pretoria), Alexis Bahl (University of British Columbia), Kelsey Aho (USDA Forest Service), Rosaria Erika Pileci, Alfonso Fernandez (Universidad de Concepcion), Kyle Mayers (NORCE Norwegian Research Centre), Scott Sugden (McGill University), Emma Davis (University of Waterloo), Ekaterina Uryupova (The Arctic Institute), Arthi Ramachandran (Concordia University), Sarah Shakil (Uppsala University), and other individuals. We would also like to thank Sarah Connors, who worked in the IPCC WG I Technical Support Unit for the AR6, and Elvira Poloczanska and Katja Mintenbeck, who worked in the IPCC WGII Technical Support Unit for the AR6, for making these group reviews possible. Finally, we would like to thank the two reviewers for their constructive comments and suggestions, which have helped improve this manuscript.
Conflict of interest
The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
Publisher’s note
All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.
Supplementary material
The Supplementary material for this article can be found online at: https://www.frontiersin.org/articles/10.3389/fclim.2024.1395040/full#supplementary-material
Footnotes
1. ^https://www.egu.eu/news/967/egu-peer-review-training-workshop-apply-now/ (Accessed February 18, 2024).
2. ^https://www.apecs.is/ (Accessed February 18, 2024).
3. ^https://pastglobalchanges.org/ecn/intro (Accessed February 18, 2024).
4. ^https://pyrn.arcticportal.org/ (Accessed February 18, 2024).
5. ^https://www.yess-community.org/ (Accessed February 18, 2024).
6. ^https://mountainresearchinitiative.org/ (Accessed February 18, 2024).
7. ^https://datahelpdesk.worldbank.org/knowledgebase/articles/906519-world-bank-country-and-lending-groups (Accessed February 18, 2024).
8. ^https://slack.com/ (Accessed February 18, 2024).
9. ^https://www.notion.so/de-de/product (Accessed June 5, 2024).
References
Aczel, B., Szaszi, B., and Holcombe, A. O. (2021). A billion-dollar donation: estimating the cost of researchers' time spent on peer review. Res. Integrity Peer Rev. 6:14. doi: 10.1186/s41073-021-00118-2
APECS (2020). Call for reviewers of the second order draft of the Working Group II contribution to the IPCC Sixth Assessment Report (AR6) [Online]. Available at: https://www.apecs.is/news/apecs-news/4146-call-for-reviewers-of-the-second-order-draft-of-the-working-group-ii-contribution-to-the-ipcc-sixth-assessment-report-ar6-2.html [Accessed January 6, 2024].
Bordoloi, M., and Biswas, S. K. (2023). Sentiment analysis: a survey on design framework, applications and future scopes. Artif. Intell. Rev. 56, 12505–12560. doi: 10.1007/s10462-023-10442-2
Braun, V., and Clarke, V. (2006). Using thematic analysis in psychology. Qual. Res. Psychol. 3, 77–101. doi: 10.1191/1478088706qp063oa
Cantor, M., and Gero, S. (2015). The missing metric: quantifying contributions of reviewers. R. Soc. Open Sci. 2:140540. doi: 10.1098/rsos.140540
Casado, M. (2018). Engage more early-career scientists as peer reviewers. Nature 560:307. doi: 10.1038/d41586-018-05956-7
Casado, M., Gremion, G., Rosenbaum, P., Caccavo, J. A., Aho, K., Champollion, N., et al. (2020). The benefits to climate science of including early-career scientists as reviewers. Geosci. Commun. 3, 89–97. doi: 10.5194/gc-3-89-2020
Castelló, M., Kobayashi, S., McGinn, M., Pechar, H., Vekkaila, J., and Wisker, G. (2015). Researcher identity in transition: signals to identify and manage spheres of activity in a risk-career. Frontline Learn. Res. 3, 39–54. doi: 10.14786/flr.v3i3.149
Castree, N., Bellamy, R., and Osaka, S. (2021). The future of global environmental assessments: making a case for fundamental change. Anthropocene Rev. 8, 56–82. doi: 10.1177/2053019620971664
Dance, A. (2022). Why early-career researchers should step up to the peer-review plate. Nature 602, 169–171. doi: 10.1038/d41586-022-00216-1
De Vries, D. R., Marschall, E. A., and Stein, R. A. (2009). Exploring the peer review process: what is it, does it work, and can it be improved? Fisheries 34, 270–279. doi: 10.1577/1548-8446-34.6.270
Ford, J. D., Cameron, L., Rubis, J., Maillet, M., Nakashima, D., Willox, A. C., et al. (2016). Including indigenous knowledge and experience in IPCC assessment reports. Nat. Clim. Chang. 6, 349–353. doi: 10.1038/nclimate2954
Gao, J., Yin, Y., Myers, K. R., Lakhani, K. R., and Wang, D. (2021). Potentially long-lasting effects of the pandemic on scientists. Nat. Commun. 12:6188. doi: 10.1038/s41467-021-26428-z
Gremion, G., Casado, M., Aho, K., Caccavo, J. A., Champollion, N., Choy, E., et al. (2021). “What peer-review experiences can offer to early career scientists and to the scientific community,” in Proceedings of the 5th International Young Earth Scientists (YES) Congress “Rocking Earth’s Future”, 5th International Young Earth Scientists (YES) Congress “Rocking Earth’s Future”. eds. T. Rose, G. de Gelder, D. Fernández-Blanco, and M. Sieber (Berlin, Germany), 144–148.
Gropp, R. E., Glisson, S., Gallo, S., and Thompson, L. (2017). Peer Review: A System under Stress. Bioscience 67, 407–410. doi: 10.1093/biosci/bix034
Gulizia, C., Langendijk, G., Huang-Lachmann, J.-T., de Amorim Borges, P., Flach, R., Githaiga, C., et al. (2020). Towards a more integrated role for early career researchers in the IPCC process. Clim. Chang. 159, 75–85. doi: 10.1007/s10584-019-02604-5
Harrop, C., Bal, V., Carpenter, K., and Halladay, A. (2021). A lost generation? The impact of the COVID-19 pandemic on early career ASD researchers. Autism Res. 14, 1078–1087. doi: 10.1002/aur.2503
Heggeness, M. L., Gunsalus, K. T. W., Pacas, J., and McDowell, G. (2017). The new face of US science. Nature 541, 21–23. doi: 10.1038/541021a
Hindshaw, R. S., Mariash, H., Vick-Majors, T. J., Thornton, A. E., Pope, A., Zaika, Y., et al. (2018). A decade of shaping the futures of polar early career researchers: a legacy of the international polar year. Polar Rec. 54, 312–323. doi: 10.1017/S0032247418000591
IPCC (2013). “Appendix A: procedures for the preparation, review, acceptance, adoption, approval and publication of IPCC reports” in Principles governing IPCC work (Batumi, Georgia). Available at: https://archive.ipcc.ch/pdf/ipcc-principles/ipcc-principles-appendix-a-final.pdf
IPCC (2020). IPCC extends review of Working Group I Sixth Assessment Report Second Order Draft [Online]. Available at: https://www.ipcc.ch/2020/04/07/extends-review-wgiar6sod/ [Accessed January 6, 2024].
Jana, S. (2019). A history and development of peer-review process. Ann. Libr. Inf. Stud. 66, 152–162. doi: 10.56042/alis.v66i4.26964
Jawak, S. D., Andersen, B. N., Pohjola, V. A., Godøy, Ø., Hübner, C., Jennings, I., et al. (2021). SIOS’s earth observation (EO), remote sensing (RS), and operational activities in response to COVID-19. Remote Sens. 13:712. doi: 10.3390/rs13040712
Kerig, P. K. (2021). Why participate in peer review? J. Trauma. Stress. 34, 5–8. doi: 10.1002/jts.22647
Kovanis, M., Porcher, R., Ravaud, P., and Trinquart, L. (2016). The global burden of journal peer review in the biomedical literature: strong imbalance in the collective Enterprise. PLoS One 11:e0166387. doi: 10.1371/journal.pone.0166387
Lee, C. J., Sugimoto, C. R., Zhang, G., and Cronin, B. (2013). Bias in peer review. J. Am. Soc. Inf. Sci. Technol. 64, 2–17. doi: 10.1002/asi.22784
López-Vergès, S., Urbani, B., Fernández Rivas, D., Kaur-Ghumaan, S., Coussens, A. K., Moronta-Barrios, F., et al. (2021). Mitigating losses: how scientific organisations can help address the impact of the COVID-19 pandemic on early-career researchers. Human. Soc. Sci. Commun. 8:284. doi: 10.1057/s41599-021-00944-1
Matsui, A., Chen, E., Wang, Y., and Ferrara, E. (2021). The impact of peer review on the contribution potential of scientific papers. PeerJ 9:e11999. doi: 10.7717/peerj.11999
McNair, R., Le Phuong, H. A., Cseri, L., and Szekely, G. (2019). Peer review of manuscripts: a valuable yet neglected educational tool for early-career researchers. Educ. Res. Int. 2019, 1–9. doi: 10.1155/2019/1359362
Min, H.-T. (2005). Training students to become successful peer reviewers. System 33, 293–308. doi: 10.1016/j.system.2004.11.003
Moraru, A., Quaglia, F. C., Kim, M., López-Quirós, A., and Huynh, H. M. (2024). Empowering early career polar researchers in a changing climate: challenges and solutions. PLOS Climate 3:e0000332. doi: 10.1371/journal.pclm.0000332
Myers, K. R., Tham, W. Y., Yin, Y., Cohodes, N., Thursby, J. G., Thursby, M. C., et al. (2020). Unequal effects of the COVID-19 pandemic on scientists. Nat. Hum. Behav. 4, 880–883. doi: 10.1038/s41562-020-0921-y
Nowell, L. S., Norris, J. M., White, D. E., and Moules, N. J. (2017). Thematic analysis: striving to meet the trustworthiness criteria. Int J Qual Methods 16:1609406917733847. doi: 10.1177/1609406917733847
O’Connell, C., and McKinnon, M. (2021). Perceptions of barriers to career progression for academic women in STEM. Societies 11:27. doi: 10.3390/soc11020027
Palutikof, J. P., Boulter, S. L., Field, C. B., Mach, K. J., Manning, M. R., Mastrandrea, M. D., et al. (2023). Enhancing the review process in global environmental assessments: the case of the IPCC. Environ. Sci. Pol. 139, 118–129. doi: 10.1016/j.envsci.2022.10.012
Pang, B., and Lee, L. (2008). Opinion mining and sentiment analysis. Found. Trends Inf. Retr. 2, 1–135. doi: 10.1561/1500000011
Pico, T., Bierman, P., Doyle, K., and Richardson, S. (2020). First authorship gender gap in the geosciences. Earth Space Sci. 7:e2020EA001203. doi: 10.1029/2020EA001203
Reynolds, J. A., and Thompson, R. J. J. (2017). Want to improve undergraduate thesis writing? Engage students and their faculty readers in scientific peer review. Life Sci. Educ. 10, 209–215. doi: 10.1187/cbe.10-10-0127
Rodríguez-Bravo, B., Nicholas, D., Herman, E., Boukacem-Zeghmouri, C., Watkinson, A., Xu, J., et al. (2017). Peer review: the experience and views of early career researchers. Learn. Publish. 30, 269–277. doi: 10.1002/leap.1111
Sharma, S., Verhagen, A., Elkins, M., Brismée, J. M., Fulk, G. D., Taradaj, J., et al. (2023). Research from low-income and middle-income countries will benefit Global Health and the physiotherapy profession, but it requires support. Int. J. Sports Phys. Ther. 18:83948. doi: 10.26603/001c.83948
Smith, R. (2006). Peer review: a flawed process at the heart of science and journals. J. R. Soc. Med. 99, 178–182. doi: 10.1177/014107680609900414
Smith, M. K., Wood, W. B., Adams, W. K., Wieman, C., Knight, J. K., Guild, N., et al. (2009). Why peer discussion improves student performance on in-class concept questions. Science 323, 122–124. doi: 10.1126/science.1165919
Solomon, S., Alley, R., Gregory, J., Lemke, P., and Manning, M. (2008). A closer look at the IPCC report. Science 319, 409–410. doi: 10.1126/science.319.5862.409c
van der Veer, L., Visser, H., Petersen, A., and Janssen, P. (2014). Innovating the IPCC review process—the potential of young talent. Clim. Chang. 125, 137–148. doi: 10.1007/s10584-014-1145-9
Vasileiadou, E., Heimeriks, G., and Petersen, A. C. (2011). Exploring the impact of the IPCC assessment reports on science. Environ. Sci. Pol. 14, 1052–1061. doi: 10.1016/j.envsci.2011.07.002
Vuong, Q.-H., Le, T.-T., La, V.-P., Nguyen, H. T. T., Ho, M.-T., Van Khuc, Q., et al. (2022). Covid-19 vaccines production and societal immunization under the serendipity-mindsponge-3D knowledge management theory and conceptual framework. Human. Soc. Sci. Commun. 9:22. doi: 10.1057/s41599-022-01034-6
Walker, T. R. (2018). Help graduate students to become good peer reviewers. Nature 561:177. doi: 10.1038/d41586-018-06632-6
Keywords: early career researchers, peer review, group review, IPCC, education, climate science, APECS
Citation: Moreno-Ibáñez M, Casado M, Gremion G, Rabanal V, Adojoh O, Anoruo C, Arshad A, Bahar FA, Bello C, Bergstedt H, Caccavo JA, Champollion N, Choy ES, De Los Ríos MF, Detlef H, Dey R, Gamal G, Guímaro HR, Hancock S, Hansen C, Hare V, Höfer J, Jabir T, Jain S, Jawak S, Latonin M, Martin J, Fredy Mojica J, O’Hara R, Onafeso O, Prasath RA, Alves EQ, Raez-Villanueva S, Rosenbaum P, Ruiz-Pereira S, Savaglia V, van Soest M and Vural D (2024) Engagement of early career researchers in collaborative assessments of IPCC reports: achievements and insights. Front. Clim. 6:1395040. doi: 10.3389/fclim.2024.1395040
Edited by:
Monirul Mirza, Environment and Climate Change Canada, Canada
Reviewed by:
Veruska Muccione, University of Zurich, Switzerland
Minh-Hoang Nguyen, Phenikaa University, Vietnam
Copyright © 2024 Moreno-Ibáñez, Casado, Gremion, Rabanal, Adojoh, Anoruo, Arshad, Bahar, Bello, Bergstedt, Caccavo, Champollion, Choy, De Los Ríos, Detlef, Dey, Gamal, Guímaro, Hancock, Hansen, Hare, Höfer, Jabir, Jain, Jawak, Latonin, Martin, Fredy Mojica, O’Hara, Onafeso, Prasath, Alves, Raez-Villanueva, Rosenbaum, Ruiz-Pereira, Savaglia, van Soest and Vural. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
*Correspondence: Marta Moreno-Ibáñez, marta.morenoibanez@colorado.edu