- 1Institute for the Future of Education, Tecnologico de Monterrey, Monterrey, Mexico
- 2School of Architecture, Art and Design, Tecnologico de Monterrey, Guadalajara, Mexico
- 3School of Engineering and Sciences, Tecnologico de Monterrey, Mexico City, Mexico
- 4School of Humanities and Education, Tecnologico de Monterrey, Monterrey, Mexico
- 5Departament d’Educació Lingüística, Científica i Matemàtica, Universitat de Barcelona, Barcelona, Spain
- 6Departamento de Matemáticas, Facultad de Ciencias Naturales, Exactas y Tecnología, Universidad de Panamá, Panama City, Panama
As traditional education systems struggle to keep pace with technological advances, incorporating Open Science into Education 5.0 is critical to addressing gaps in student skills. This study introduces the MAICC model, a tool designed to foster complex thinking in higher education students through the evaluation of citizen science projects. The model integrates inquiry-based learning and service-learning, and it develops critical and reflective skills by applying them in real-life settings. A mixed-methods approach combining qualitative and quantitative analysis was used to assess student engagement and skills development. Findings indicate that the MAICC model promotes complex thinking, that the evaluation of citizen science projects enhances critical thinking, and that the model is distinguished by its emphasis on citizen science and educational technology. The discussion highlights the important role of citizen science in education and suggests future research exploring its wider application across disciplines and contexts to enhance 21st-century skills.
1 Introduction
In the rapidly evolving landscape of education and scientific research, integrating Open Science into Education 5.0 has become increasingly crucial. Advances in technology and the growing volume of scientific knowledge call for education systems that can effectively adapt to and embrace these shifts. However, conventional educational settings often lag behind and struggle to equip students with the skills and knowledge necessary to thrive in a digital and data-driven world. This challenge of adopting Open Science is pivotal in addressing the gap emerging in the development of student competencies (Deev et al., 2020). The urgency behind this need for integration is highlighted by the growing demand for educational models that not only adopt technological innovation, but also foster critical thinking, creativity and collaborative problem-solving skills that are vital to navigating 21st century intricacies. One response is the MAICC Model (Monitoring and Assessment through Integrated Citizen Collaboration), formulated to provide an innovative strategy to overcome this disparity by exploiting the strengths of Citizen Science and educational technology platforms.
The concept of Open Science coupled with the principles of Education 5.0 offers a transformative approach to the teaching-learning process. Open Science advocates for transparency, accessibility, and the collaborative building of knowledge (Mckiernan et al., 2016), while Education 5.0 emphasizes personalized learning and the integration of technological advances and new pedagogical approaches to better prepare students in problem-solving and critical thinking. This new paradigm aims to transform the educational system by incorporating digital pedagogies and the technologies of the Fifth Industrial Revolution (Industry 5.0) (Meniado, 2023). Notwithstanding their potential, the uptake of these innovative concepts continues to be hampered by significant hurdles, such as institutional resistance to change, insufficient infrastructure, and inadequate training of educators. Overcoming such setbacks is central to realizing the full benefits of Open Science and Education 5.0, notably in terms of fostering inclusive, equitable, and high-quality education for all. However, the slow adoption of this pairing in formal education systems has hindered the full realization of its potential. This gap calls for innovative educational strategies that integrate these concepts effectively into practical learning environments.
The aim of this study is to present the process undertaken to develop the MAICC model, which included testing it on a digital platform for the evaluation of citizen science projects. Students participating in the MAICC Model used the platform for data collection and citizen science project evaluation, activities intended to develop their critical and scientific thinking. The MAICC Model introduces a dynamic and innovative strategy for contemporary education aligned with the tenets of Open Science and Education 5.0. By integrating Citizen Science as an educational framework (i.e., using citizen science practices and projects for educational purposes) and harnessing technology, the model offers a practical and meaningful approach to cultivating complex thinking skills and other competencies in higher education students. Through the recognition, research, and evaluation of real Citizen Science projects, the model not only fosters critical and scientific thinking but also encourages creativity, digital literacy, and co-creation, all essential in today’s technologically driven environment.
Kumar’s (2009) Design Innovation Process methodology was followed to build the MAICC Model because it establishes guidelines for the design of an educational model and helps ensure a comprehensive final proposal. Student participation provides hands-on, experiential learning opportunities based on the combination of two educational approaches: Inquiry-Based Learning (IBL) and Service Learning (SL). This experiential learning framework not only reinforces students’ academic skills but also promotes their civic responsibility and engagement, preparing them for active participation in society. The learning process for students includes inquiry and research, reflective practice, feedback, and a creative process, which enhances the student experience and the development of competencies. The value of the MAICC Model lies in its potential to transform educational practices. It is a dynamic and innovative approach to modern education that aligns educational strategies with the demands of a digital society; in this sense, it equips students with the competencies needed for the 21st century. The theoretical framework, methodology, primary results, and conclusions of this study are presented in the following sections.
2 Theoretical framework
2.1 Foundations of education 5.0 and Open Science
Education 5.0 represents an emerging paradigm that emphasizes personalization, interactivity, and the integration of advanced technologies into the learning process (Tavares et al., 2023). This approach seeks to overcome the limitations of previous models by incorporating artificial intelligence, Big Data, and augmented reality to create more immersive and adaptive learning experiences. According to Thornhill-Miller et al. (2023), the essential competencies for the future are creativity, critical thinking, collaboration, and communication, also known as 21st-century skills (Almerich et al., 2020; Astiswijaya et al., 2023; Dwyer et al., 2014; Sánchez et al., 2022), envisioned for the post-pandemic era (Siddiq et al., 2023). These technologies could facilitate the development of critical competencies, such as complex problem solving, to prepare students to face future challenges in a continuously changing world. In addition, this paradigm advocates for a focus on autonomous student learning, where learning is self-directed, and educators act as facilitators, a transition that requires a profound reconsideration of traditional teaching methods.
In this scenario, Open Science is a natural complement to Education 5.0 by promoting transparency, accessibility, and collaboration in scientific research. Open Science democratizes access to scientific knowledge (Kurtulmus, 2021) and fosters a culture of sharing and co-creation at the heart of Education 5.0. Making research data accessible and reusable, Open Science facilitates educational innovation and promotes more significant interactions among researchers and educators. As Haim et al. (2023) point out, encouraging greater engagement with Open Science in the future requires providing knowledge, guidelines, resources, and social and structural support. This enables the creation of teaching materials based on the latest research, ensuring that educational content is relevant, current, and supported by scientific evidence. Integrating Open Science into Education 5.0 enriches the learning process, prepares students to participate in research, and contributes to advancing knowledge (Shyshkina, 2024).
The synergy between Education 5.0 and Open Science materializes in creating learning environments that prepare students to face the challenges of the knowledge society and foster a culture of scientific exchange and collaboration. Effective implementation of Education 5.0, framed within the principles of Open Science, requires a robust technological infrastructure and educational policies that promote the creation and use of Open Educational Resources (OER). According to UNESCO (2021), OER offers unprecedented opportunities for educators to share, use, and reuse content, which is fundamental to developing innovative and accessible educational practices. Adopting these resources facilitates the personalization of learning and promotes educational equity by ensuring that students from diverse backgrounds have access to high-quality materials.
Also, educational technology and digital literacy have become fundamental pillars within Education 5.0, facilitating access to many learning resources and opportunities. Digital competencies include the skills to operate digital devices effectively, utilize communication tools, and engage with online networks for information management, content creation, and collaboration. They involve proficiently using information and communication technologies for critical information handling, content generation, and communication within digital spaces (Sarva et al., 2023; Shyshkina, 2024). The latter is essential for navigating the information age, enabling students and educators to participate in collaborative research and contribute to open science fully. Integrating educational technologies promotes active and personalized learning methodologies, providing platforms for experimentation, simulation, and access to global learning communities. In this context, digital literacy enriches the educational experience and becomes a core competency for continuous professional development in an increasingly digitalized world (Reddy et al., 2020).
2.2 Inquiry-based learning and service learning to foster 21st century competencies
Inquiry-based learning (IBL) is a pedagogical approach that promotes the development of critical competencies, such as critical thinking, problem-solving, and the ability to conduct autonomous research. This approach focuses on the student as the primary agent of their learning, encouraging exploration, questioning, and critical reflection on learning content and processes (Leif et al., 2023; Nollmeyer and Baldwin, 2022; Sala Sebastià et al., 2017; Sala Sebastià et al., 2021). Through IBL, students actively engage in knowledge construction by exploring authentic questions, enabling them to develop research and analytical skills essential to their performance in professional and academic contexts. The effectiveness of IBL for developing competencies is supported by constructivist learning theory, which argues that students construct their knowledge through experience and interaction with their environment (Andrini, 2016; Bruner, 1961). Thus, IBL enhances conceptual understanding and promotes the development of cross-cutting skills necessary for lifelong learning.
Service-learning (SL) can be seen as another educational approach which links academic learning with community service. It allows students to apply theoretical knowledge in real situations while contributing to the community (Aramburuzabala and Cerrillo, 2023). This approach is based on the idea that effective learning occurs when students acquire knowledge and skills in the classroom and apply them in real contexts. Combining IBL and SL allows students to apply theoretical knowledge in practical situations, enhancing their learning and engagement. Some studies suggest students develop self-regulation and self-learning processes (Al Mamun and Lawrie, 2023). This joint approach enriches the educational experience by providing authentic and meaningful contexts for learning, preparing students to be competent professionals and active citizens (Ash, 2009). Projects developed under this integrated modality can address complex community challenges, fostering deep reflection and meaningful learning. In addition, this combination supports the development of transversal competencies demanded in the 21st century, such as collaboration, effective communication, and adaptability.
2.3 Citizen science as an educational tool
In the last decade, citizen science has gained recognition as an invaluable educational avenue for active and engaged learning by involving citizens in scientific research projects. Citizen science, seen as a pedagogical approach, taps into individuals’ innate curiosity and interest in learning, allowing them to participate directly in data collection, analysis, and real-world problem-solving (Zhang et al., 2023; Ozden and Velibeyoglu, 2023). According to Lüsse et al. (2022), participation in citizen science projects increases scientific understanding and knowledge among participants and enhances their critical thinking and collaboration skills. Moreover, as an educational modality, it reinforces the connection between science and society (Bonney et al., 2014), promoting the democratization of science and fostering a culture of continuous learning and active participation in science.
Including citizen science in the educational setting can transform science teaching and learning by providing authentic, contextual learning experiences directly applicable to the real world (Roche et al., 2020). Recent research indicates that citizen science projects in schools can significantly increase students’ interest in science and improve their understanding of scientific methods (Ballard et al., 2017). In addition, these projects promote essential 21st-century skills such as data analysis, critical thinking, and effective communication (Nugent et al., 2015). By facilitating collaboration among students, educators, and professional scientists, citizen science as an educational tool also helps develop a broader and more diverse learning community (Alfaro-Ponce et al., 2023), where the contributions of each participant are valued and leveraged. This enriches students’ educational experience and contributes to scientific knowledge advancement, showing the value of diverse voices in the scientific research process.
Moreover, as an educational approach, citizen science offers a setting that blurs traditional boundaries between professional scientists and non-experts. By engaging non-scientists in authentic scientific inquiry, citizen science promotes experiential learning, critical thinking, and a more profound understanding of scientific processes (Turrini et al., 2018). This helps democratize science by making it accessible to a broader audience and enhances public awareness and appreciation of science (Bonney et al., 2014; Hecker et al., 2018). Integrating Citizen Science into educational contexts has increased student engagement and motivation (Edwards et al., 2023), providing hands-on experiences that link classroom learning to real-world scientific challenges (Phillips et al., 2018).
Evaluating citizen science projects is essential to ensure the quality of the data collected, assess scientific and social impact, and improve volunteer participation and learning. Recent proposals include frameworks to monitor and evaluate projects in this field, integrating metrics of scientific and educational impact, citizen engagement, and motivation (Wehn et al., 2021; Calyx and Finlay, 2022; Levontin et al., 2022). One example is the development of digital assessment tools that facilitate collecting and analyzing real-time participant feedback, enabling agile adjustments to projects (Bonney et al., 2014; Theobald et al., 2015). These innovations optimize the benefits of citizen science for the scientific community and the citizens involved.
Assessment makes it possible to identify both the achievements and the challenges of projects, facilitating continuous improvement and the development of best practices in citizen science. Evaluating citizen science projects is crucial for validating research quality and developing the competencies of the evaluators. Participation in evaluation processes offers a unique opportunity to improve analytical, communicative, and critical thinking skills, which are essential in research (Canaleta et al., 2014; Cooper, 2014; LaVelle et al., 2023). It leads to an enriched understanding of scientific methodologies and the continuous refinement of projects, ensuring their applicability and efficacy. Consequently, evaluation evolves into a reciprocal learning experience, augmenting the value of citizen science projects and the professional growth of evaluators. Contemporary initiatives aim to devise evaluation frameworks that encapsulate various facets of citizen science impact, including the enhancement of participant learning, contributions to scientific knowledge, and socio-environmental advantages (Kullenberg and Kasperowski, 2016). One of these proposals developed a typology for evaluating citizen science projects with the development of complex thinking as a core objective; Sanabria-Z et al. (2022) suggest that this framework can effectively guide the design and evaluation of citizen science projects with a holistic impact and develop relevant competencies for participants. The MAICC Model adopts their typology and frame of reference for Citizen Science projects as a fundamental element.
3 Methodology
3.1 Methodology to design the MAICC model
The methodology followed for designing the MAICC Model was based on the Design Innovation Process by Kumar (2009). Kumar’s proposal allows relevant indicators to be considered when establishing guidelines for the design of an educational model such as the one proposed; these indicators guide the design process and ensure that the final proposal is as comprehensive as possible. The phases of Kumar’s methodology move from the initial ideas that define the innovation to be developed, through the analysis needed to diagnose needs and context, to the theoretical foundation that supports the proposed design. Another attribute of Kumar’s model is that it facilitates collaborative work among researchers, with rounds of ideation and feedback until consensus is reached. Kumar’s methodology emphasizes the importance of a systematic process incorporating research, analysis, synthesis, and realization to explore and exploit innovation opportunities effectively. The approach is intended for designers and non-designers alike, encouraging collaborative efforts that address challenges and harness the potential of design thinking for innovation. In this instance, the research team, comprised of researchers with expertise in educational innovation, design, and science communication, followed Kumar’s proposal to develop the MAICC educational model as an educational innovation.
Figure 1 displays the elements and stages of this method, depicting the indicative structure and sequence for any project. The following is a description of each section and the sequence established by Kumar, who defined seven stages of the design innovation process, each with objectives.
1. Sense the intention: the objective is to establish an initial “innovation intention” based on an intuitive and tentative perception of the potential to create new value and how to discover it. The problem is defined through a rapid diagnosis, and the innovative intention is stated, which initially specifies the likely users, their needs, potential offerings, and benefits.
2. Know the context: the objective is to grasp the contemporary landscape and its historical evolution within the given context. Facilitating collaboration across disciplines during this phase requires categorizing the gathered findings and insights into universally recognized themes.
3. Know the people: the focus is on understanding the current and prospective users of a new product or service, along with other key stakeholders. The aim is to derive meaningful insights from observations, where an “insight” is identified as a significant discovery or understanding that comes from observing real-life behaviors.
4. Frame the insights: once the data on users and context have been collected, the next step is to structure the information gathered. The acquired data are categorized, grouped, and systematized to identify significant trends during this phase. The essence of this stage is the detection of recurring insights and patterns from multiple data analyses. The insights are then distilled into principles: proactive, generative statements, criteria, or assumptions that facilitate conceptual thinking and perspectives for visualizing the future.
5. Explore the concepts: this phase emphasizes organized ideation to identify opportunities and develop innovative concepts, building on ideas, principles, and criteria established in previous phases. Exploration occurs at the micro level to address specific issues and at the macro level to identify global issues.
6. Frame the solutions: the concept exploration process yields numerous ideas, necessitating their assessment to pinpoint those most beneficial to stakeholders. Additionally, organizing these concepts into practical categories and hierarchies is crucial. During this phase, iterative prototyping is employed to scrutinize the concepts to identify unexpected challenges and opportunities early. Specific evaluation criteria are set for each testing cycle, rooted in the innovation’s initial aim and the previously proposed ideas. The most viable conceptual systems are then delineated as comprehensive solutions.
7. Realize (implement) the offerings: after devising potential solutions and assessing prototypes, evaluating these solutions is imperative for implementation. Upon identifying high-value solutions, the next step involves developing implementation strategies. Crafted roadmaps outline the anticipated development. Stakeholders receive these strategic plans illustrating the essential actions required to accomplish the solution.
Figure 1. Representation of the Design Innovation Process proposed by Vijay Kumar (source: own representation).
The methodological sequence was adopted based on the steps established in the Design Innovation Process, making it possible to develop idea-generating mechanisms leading to the proposal presented in this article. The steps adapted for the design and development of the MAICC educational model are detailed below:
3.1.1 Sense the intention
The educational objectives of the model are defined. In this case, the design and implementation of an advanced educational model rooted in the principles of Education 5.0 and Open Science were conceived to foster critical and scientific thinking among higher education students, employing emerging technologies and participatory methods to acquire and apply complex knowledge, based on the students’ assessment skills. This approach prepares self-directed and collaborative learners for knowledge generation, innovation, and research, enabling them to address significant challenges in their disciplines and contribute effectively to societal progress.
3.1.2 Know the context
Investigate the higher education environment to identify needs, challenges, and opportunities. This involves understanding the current dynamics in higher education in order to develop complex thinking from the perspectives of Education 5.0 and Open Science. At this stage, it became clear that, in a context of accelerated digital transformation, educational institutions must adapt their teaching methods to prepare students for the challenges of the 21st century. Likewise, Education 5.0 places great emphasis on the personalization of learning, adapting to the individual needs, skills, and rhythms of each student, while the principles of Open Science call for free and open access to educational resources, research data, and scientific results, thereby democratizing knowledge (UNESCO, 2021). To achieve this, teachers must adopt educational technologies that allow the creation of richer and more interactive educational experiences to stimulate competency development (Ogwu et al., 2022; Mhlongo et al., 2023; Technology in education − 2023 GEM Report, 2023).
3.1.3 Know the people
Engage students, teachers, and participants to understand their expectations, needs, and interactions in the educational dynamics that foster complex thinking.
3.1.4 Frame the insights
Analyze the gathered information to identify unmet needs and potential areas of innovation. At this stage, in a first step, part of the research team observed the dynamics of the learning strategies implemented in courses at their institution that require the development of complex thinking. The researchers perceived a lack of innovative approaches and technological implementation for this purpose. The literature review also identified that universities face the challenge of training future professionals in complex thinking skills to meet the demands of today’s world (Patiño et al., 2023); students are often unable to apply theoretical knowledge in real-world environments, which limits their competency development (Cruz-Sandoval et al., 2023). In a second step, it was determined that assessing citizen science projects would be a convenient way to develop the intended competencies. The proposal of Sanabria-Z et al. (2022) argued that citizen science projects in which students act as evaluators provide an instrument of analysis and assessment that enables them to develop the intended competencies, including complex thinking.
3.1.5 Explore the concepts
Generate various ideas on how students can develop the sub-competencies of complex thinking (critical and scientific thinking) by evaluating citizen science projects. In this stage, through continuous sessions and brainstorming, the researchers collected ideas for instructional strategies that could be implemented to develop the students’ sub-competencies in complex thinking. Table 1 shows some of the proposals that emerged. They were classified into key categories that reflect different aspects of the learning process and the development of competencies.
Table 1. Brainstorming instructional strategies for developing complex thinking sub-competencies (source: own representation).
3.1.6 Frame solutions
Among the research team, the most promising ideas were selected and developed into concrete proposals. With this, a detailed implementation plan was established, including learning objectives, assessment activities, necessary resources, and success criteria. The plan outlines the pedagogical methodology, specifying the digital tools that facilitate the evaluation. In this phase, the research team conducted a series of meetings to generate structured ideas to explore possibilities. The meetings and inputs were documented to record the construction of the MAICC Model.
• Pedagogical approach
It was determined that the pedagogical approach underpinning MAICC would be IBL combined with SL elements. This combination fosters an experiential education in which students investigate and analyze citizen science projects with a social impact. The instructional design, based on the combination of IBL and SL, is structured in modules that include recognition of relevant citizen science projects, analysis and critical evaluation of these projects through the typology developed in “Threshold for Citizen Science projects” (Sanabria-Z et al., 2022), active participation in data collection and analysis, and reflection on the societal impact of these projects and the students’ learning.
• Digital tools
Developing competencies through evaluating citizen science projects requires incorporating digital tools that facilitate evaluation and critical analysis. For this purpose, a free-access educational platform was designed according to the established objectives. This entailed establishing the general tasks and facilities that the platform should provide to the participating students, teachers, researchers, and platform managers.
3.1.7 Realize the offerings
In this phase, a roadmap was constructed showing the educational model’s progression through several phases and the platform interface that functions as a complementary tool. The model was implemented in a controlled environment with a group of 89 students as a pilot test. Students’ opinions were collected through focus group sessions.
Figure 2 summarizes the process followed by the research team to build the proposed MAICC Model.
Figure 2. Representation of the steps followed by the research team to conceive the MAICC Model (source: own representation).
3.2 Methodology for measuring the implementation of MAICC model
The proposed model was implemented with the aim of observing participants’ behavior, obtaining their perceptions of the educational experience, and studying the evolution of their perceived complex thinking and its sub-competencies, with special interest in critical thinking and scientific thinking. This was achieved by applying the eComplexity instrument (Vázquez-Parra et al., 2024) on the perception of complex thinking twice: a pre-test before the start of the proposed activities and a post-test at their conclusion. In addition, a qualitative report that the students completed in one of the MAICC activities was retrieved to obtain their perceptions.
The data collection and analysis process was carried out following a mixed methodology, which included qualitative and quantitative analyses to assess student participation and the development of key competencies. A group of 89 higher education students was recruited through open calls in educational institutions. The selection criteria included voluntary participation and representation from various academic disciplines. Participants were incentivized with institutional recognition to ensure a broad and diverse demographic representation, including different genders, ages, and educational levels. Assessment instruments were used to establish a baseline and measure progress in critical and scientific thinking competencies. The students completed the eComplexity instrument at the beginning and end of the study, which allowed their perception of the development of their complex thinking competencies to be measured. In addition, specific rubrics, based on the citizen science project threshold proposed by Sanabria-Z et al. (2022), were developed to evaluate citizen science projects, detailing the criteria for assessing the projects’ scientific validity and social relevance. Finally, surveys were used to collect qualitative data on the students’ experience, their perceptions of the model, and the impact on their competency development.
The quantitative data collected through the pre-test and post-test were analyzed using statistical methods to assess significant changes in critical and scientific thinking competencies, particularly the Wilcoxon test, which was used to determine whether the different dimensions of complex thinking improved significantly after the educational intervention through MAICC. The qualitative data obtained from the report activity enabled a thematic analysis of the learning that students perceived from their work on the MAICC platform. This process involved coding responses to identify recurring themes and patterns in students’ experiences and perceptions. This approach allowed for a deep understanding of the MAICC Model’s impact from the participants’ perspective, complementing the quantitative findings. The combination of qualitative and quantitative methods provided a comprehensive and multifaceted assessment of the impact of the MAICC Model. The use of multiple assessment instruments and the inclusion of diverse perspectives ensure the validity and reliability of the data collected, reinforcing the hypothesis that the model effectively enriches higher education by developing students’ critical and scientific thinking competencies.
Lastly, OpenAI’s ChatGPT-4, based on the GPT-4 architecture, was utilized to aid in the initial drafting and idea development. The content generated was then thoroughly reviewed and refined by the authors to ensure it accurately reflected the research objectives.
4 Results
This section is divided into two parts. First, we present the model that resulted from applying Kumar’s methodology. Subsequently, we present the quantitative and qualitative results of implementing the model with a group of students, in which the perceived level of complex thinking was measured with a pre-test before the pedagogical activities proposed in MAICC and compared with a post-test at the end of the activities.
4.1 Description of the MAICC model
The search for educational strategies that promote the development of critical and scientific thinking led to the design of an innovative educational model for higher education students, the MAICC Model. It focuses on the active role of students as evaluators of citizen science projects using a specific technological platform. The model, which emerged from the researchers’ application of Kumar’s (2009) Design Innovation Process, rests on fundamental pillars that blend Inquiry-Based Learning, Service-Learning, and Citizen Science as an educational framework. The model seeks to develop students’ analytical and critical skills and to encourage active, conscious participation in scientific research.
The MAICC Model rests on the premise that meaningful learning occurs when students are actively involved in the educational process, participating in real situations that demand the application of their knowledge and skills. By integrating Inquiry-Based Learning and Service-Learning, we propose an education that transcends the limits of the formal classroom and can be implemented and integrated into a transversal curriculum design with courses that seek to develop complex thinking competencies. By assessing citizen science projects, students acquire the ability to apply rigorous analytical and methodological standards, thereby developing a deep understanding of research processes and the societal implications of scientific endeavors. This approach enables students to acquire solid and critical knowledge, preparing them to face complex challenges and contribute meaningfully to society.
An innovative element of the model lies in implementing a specific technological platform for evaluating citizen science projects. This tool was designed with a modular structure (see Figure 3) that enables students to work independently under guided instruction from teachers. The platform’s modularity facilitates the intuitive accomplishment of activities; the assessment process is based on the citizen science threshold proposal of Sanabria-Z et al. (2022). It promotes interactions among those involved (students, researchers, teachers, and the future managers of the evaluated citizen science projects). The platform also hosts the results and findings through open educational resources created by those who evaluate the citizen science projects.
Figure 3. Learning experience architecture and tools hosted on the platform (source: own representation).
The platform’s functionalities begin with student registration and the completion of a pre-test survey to assess their initial perception of complex thinking. After the evaluation process, participants complete the same survey as a post-test to measure the development of their complex thinking skills following their activities on the platform. The measurement instruments enable formative and continuous assessment to monitor the model, analyze the results, and organize stakeholders’ focus groups. Figure 3 shows the architecture of the learning experience supported by the platform.
The Inquiry-Based Learning approach underpins the four modules of activities on the MAICC platform. From the project assignment onward, students carry out research activities around the assigned project, which allows them to analyze and discern information to characterize the project and document the findings. In the second module, participants learn the typology “Threshold for Citizen Science projects” (Sanabria-Z et al., 2022), which provides the conceptual tools to evaluate the project according to the typology. Subsequently, in the qualitative reporting activity of module 3, students reflect on the previous activities and produce a report that gathers their reflections and analysis. Finally, in module 4, participants design an Open Educational Resource, synthesizing the research carried out and developing their creative abilities to communicate the findings of the assigned project. The Service-Learning approach is specifically integrated in modules 2 and 4, where students analyze and evaluate the impact of the assigned project and report the results through an open educational resource. Such evaluations are useful tools for guiding the planned impact of new citizen science projects. Consequently, this practical involvement equips students with the skills needed to contribute to future citizen science initiatives, thus making an impact in the field.
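To make the module sequence described above easier to visualize, the sketch below represents the MAICC learning flow as a simple data structure in Python; the module labels, approaches, and deliverables are paraphrased assumptions for illustration only and do not reflect the platform’s actual data model.

```python
# Illustrative sketch of the MAICC learning flow described above; names are
# assumptions for exposition, not the platform's actual implementation.
from dataclasses import dataclass

@dataclass
class Step:
    name: str
    approach: str      # pedagogical approach underpinning the step
    deliverable: str   # what the participant produces

MAICC_FLOW = [
    Step("Pre-test", "eComplexity survey", "baseline perception of complex thinking"),
    Step("Module 1: project research", "Inquiry-Based Learning", "characterization of the assigned project"),
    Step("Module 2: typology-based evaluation", "IBL + Service-Learning", "assessment with the Threshold typology"),
    Step("Module 3: qualitative report", "Inquiry-Based Learning", "reflective report on learning"),
    Step("Module 4: OER creation", "IBL + Service-Learning", "open educational resource (e.g., an infographic)"),
    Step("Post-test", "eComplexity survey", "post-intervention perception"),
]

for step in MAICC_FLOW:
    print(f"{step.name} [{step.approach}]: {step.deliverable}")
```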
The training and continuous development of professors interested in implementing the MAICC Model is essential to ensure its effectiveness. Teachers are crucial facilitators of learning, guiding students in their process of inquiry and critical reflection. Training teachers and students in the use of the technological platform, together with the pedagogical design, ensures a successful implementation of the model and the creation of a dynamic and reflective learning environment. Teachers who wish to implement the MAICC Model can form a community that shares experiences, strategies, and resources in collaborative spaces. The model’s flexible design makes it scalable, allowing participation beyond one specific audience. Moreover, the evaluation of citizen science projects can include objectives beyond developing complex thinking. Considering all this, the integration of this technology with the pedagogical approaches adopted reinforces experiential learning and the development of critical digital skills for the 21st century.
The proposed MAICC Model compellingly addresses contemporary educational demands by promoting the development of critical and scientific thinking skills through active participation in citizen science. By integrating mixed pedagogical methodologies and technology, the model can potentially transform higher education, preparing students to be informed and context-aware through the issues addressed in the citizen science projects. Continuous evaluation of the model is essential for its adjustment and improvement, so evaluation mechanisms have been established to identify its impact on the development of students’ competencies. Systematic evaluations enable evidence gathering on the model’s effectiveness in fostering critical and scientific thinking, sub-competencies of complex thinking. This process allows the proposal to be validated and adapted contextually, increasing its impact and relevance. From the preceding, seven principles emerge that comprise the proposed model (see Figure 4).
Figure 4. Representation of the proposed MAICC model for developing complex thinking through citizen science project evaluation (source: own representation).
4.2 Quantitative results
This section presents quantitative data revealing the impact of implementing the proposed model with a group of 89 students. The implementation involved two types of students. On the one hand, students taking the course “Research Methodology and Human Factors” (step 2: curriculum design), in which the MAICC activities were inserted in the initial stage of the course as complementary, autonomous activities. On the other hand, another group of students participated voluntarily in response to an open call.
The results reflect the changes in the perceived levels of complex thinking and its sub-competencies, captured through the pre-test and post-test surveys applied at the beginning and end of the MAICC activities. The main trends and variations observed are shown below.
The Wilcoxon test comparing pre-test and post-test results across the dimensions of complex thinking shows significant improvements in all areas assessed. A detailed interpretation of the results is presented below.
In the overall comparison of pre-test and post-test totals, 62 positive ranks were observed versus 25 negative ranks (Z = −4.096, p < 0.001), indicating a significant improvement in the post-test total compared to the pre-test. For the systems thinking dimension, 51 of the 89 participants showed positive ranks and 30 showed negative ranks (Z = −2.541, p = 0.011), suggesting a significant post-test improvement in systems thinking.
In the scientific thinking dimension, there were 55 positive ranks compared to 22 negative ranks (Z = −3.517, p < 0.001), reflecting a significant improvement in post-test scientific thinking. In critical thinking, 50 positive ranks were found compared to 27 negative ranks (Z = −3.217, p = 0.001), indicating a significant improvement in critical thinking after the implementation. Finally, innovative thinking showed 53 positive ranks and 16 negative ranks (Z = −3.802, p < 0.001), demonstrating a significant improvement in innovative thinking after the implementation.
These results underline the effectiveness of the educational intervention in the development of complex thinking, encompassing the areas of systems, scientific, critical and innovative thinking. The consistency in the improvements observed reinforces the validity of the educational approach implemented to enhance these dimensions of thinking in the participants. Table 2 shows the results obtained in the Wilcoxon signed-rank test of related samples.
The results shown in Table 3 indicate that in all the dimensions assessed (systemic, scientific, critical, and innovative thinking), the differences between the pre-test and the post-test are statistically significant, with significance levels of p < 0.05. The negative Z statistic reflects that, in each case, the sum of negative ranks (participants whose post-test score was lower than their pre-test score) is considerably smaller than the sum of positive ranks, indicating a substantial improvement in complex thinking skills after the educational intervention.
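For readers who wish to reproduce this kind of pre-test/post-test comparison, the sketch below applies the Wilcoxon signed-rank test to paired scores in Python using SciPy; the score arrays are synthetic placeholders rather than the study data, and the variable names are illustrative only.

```python
# Minimal sketch of a pre-test/post-test Wilcoxon comparison.
# The score arrays below are synthetic placeholders, not the study data.
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(0)
pre = rng.integers(1, 6, size=89).astype(float)              # hypothetical pre-test scores (1-5 scale)
post = np.clip(pre + rng.normal(0.4, 1.0, size=89), 1, 5)    # hypothetical post-test scores

diff = post - pre
positive_ranks = int(np.sum(diff > 0))   # cases where the post-test score increased
negative_ranks = int(np.sum(diff < 0))   # cases where the post-test score decreased
ties = int(np.sum(diff == 0))            # ties are discarded by the test

statistic, p_value = wilcoxon(post, pre)  # two-sided Wilcoxon signed-rank test
print(f"positive ranks={positive_ranks}, negative ranks={negative_ranks}, ties={ties}")
print(f"W={statistic:.1f}, p={p_value:.4f}")
```

The same paired comparison would be run once per dimension (systems, scientific, critical, and innovative thinking) to obtain the statistics reported in Tables 2 and 3.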
4.3 Qualitative results
As qualitative results, we collected and analyzed the participants’ shared perceptions of the work carried out in the MAICC activities. From the qualitative report on the learning acquired (the module 3 activity), we collected the answers of 84 participants to the question “After the analysis, what learning did you take away with you?” The answers of 5 participants were not considered because they did not correspond to the specific question asked.
A content analysis was carried out on the responses obtained; patterns were identified, and the following classification codes were established:
• CitizenValue: allocated to responses that highlight the usefulness and value of citizen science.
• PracticalExperience: allocated to answers that mention gaining practical experience in analysis or research.
• MethodologyResearch: allocated when students mention learning about research methodologies.
• ImpactApplicability: assigned to answers that discuss the potential impact and applicability of citizen science.
• DiscoveryOrganisations: allocated to responses that mention discovery of organizations or projects.
The analysis of the assigned codes indicates that the category “ImpactApplicability” was the most frequent, with a total of 37 mentions. This reflects participants’ strong concern and interest in how citizen science can have a tangible, applicable impact on real problems. The second most frequent code was “DiscoveryOrganisations,” with 31 mentions, suggesting that participants value learning about new citizen science initiatives and projects. “MethodologyResearch” was also a prominent category, with 29 mentions, underlining the importance students place on learning about specific research methods. “CitizenValue” had 23 mentions, reflecting recognition of the value of citizen science in education and the community. Finally, “PracticalExperience” had 14 mentions, indicating that although less frequent, students still significantly value learning through practical experience in projects. In conclusion, the results suggest that students find citizen science relevant and valuable not only for its direct impact and applicability, but also for the opportunities it provides to discover new organizations and learn research methodologies, while citizen value and hands-on experience also play important roles in their perception and learning. Figure 5 presents these results visually.
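As a minimal illustration of how such code frequencies can be tallied once responses have been manually coded, the following Python sketch counts and ranks the classification codes; the coded responses shown are invented placeholders rather than the participants’ actual answers.

```python
# Minimal sketch: tallying manually assigned classification codes.
# The coded responses listed here are invented placeholders.
from collections import Counter

coded_responses = [
    ["ImpactApplicability", "MethodologyResearch"],
    ["DiscoveryOrganisations"],
    ["CitizenValue", "ImpactApplicability"],
    ["PracticalExperience"],
    # ... one entry per participant report (84 in the study)
]

counts = Counter(code for codes in coded_responses for code in codes)
for code, mentions in counts.most_common():
    print(f"{code}: {mentions} mentions")
```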
Something relevant to note is that during the process of participating in MAICC, the research activities carried out by the participants are self-regulated, progressing at their own pace and in their own style. In addition, participants engage in metacognitive processes of analysis and synthesis, both in the evaluation of projects and in the creation of open educational resources such as infographics. This structured but flexible methodology enhances participants’ ability to critically engage with assigned projects and contribute to the evaluation conducted.
5 Discussion
The MAICC Model proves to be an innovative educational strategy that significantly facilitates the development of complex thinking in higher education students. Under the MAICC Model, students showed a notable improvement in their critical and scientific thinking skills, evidenced by the results observed in the pre-test and post-test surveys, their participation in creating OERs (Open Educational Resources), and the quality of their citizen science project evaluations. This is in keeping with the pedagogical approaches adopted, Inquiry-Based Learning and Service-Learning, which emphasize the importance of direct experience and reflection in learning (Andrini, 2016; Ash, 2009), aligning with the perspective that education must evolve toward practices that prepare students to face complex challenges (Phillips et al., 2018). Therefore, the MAICC Model supports pedagogical approaches to active and participatory learning, providing a practical and effective framework for integrating them into higher education and promoting deeper, applied learning.
Active participation in the evaluation of citizen science projects lets students enhance their critical thinking. Through participation in authentic scientific inquiry, students gain tangible knowledge and experience. By fostering enquiry skills through the activities set out in the MAICC model, students are equipped with the tools to analyse complex situations, propose creative solutions and ultimately make informed decisions. This approach combines the strengths of Education 5.0 and Open Science to personalize learning, allowing students to progress at their own pace on a specially designed platform. This tool enriches the educational experience, making it more interactive and accessible. It also facilitates the application of knowledge in practical and real contexts, contributing to a more effective and applied training. Thus, the MAICC Model emerges as an innovative educational strategy that meets current demands and equips students for the future.
The MAICC model incorporates innovative elements that set it apart from other educational models, mainly due to its focus on Citizen Science and educational technology. The effective integration of Citizen Science and technological platforms in higher education has improved student participation and digital competencies. This innovation aligns with the principles of Education 5.0, which emphasizes personalized learning and advanced technologies (Meniado, 2023; Tavares et al., 2023), and also with the principles of Open Science, by fostering education and training in open science practices, making educational resources free and accessible, and promoting process literacy in science, in this case, through the recognition, investigation, analysis and evaluation of citizen science projects. Therefore, the MAICC Model significantly advances educational methodologies, offering a more adaptive and participatory framework for higher education.
6 Conclusion
The aim of this study was to describe the process of creating the MAICC model, which aims to develop critical and scientific thinking in university students by involving them in the evaluation of real citizen science projects using a typology that measures their impact. We have therefore described the MAICC model as a dynamic and innovative strategy for contemporary education, aligned with Open Science and Education 5.0 principles. The following findings emerged: (a) the MAICC model effectively promotes complex thinking in higher education students; (b) the evaluation of citizen science projects enhances students’ critical thinking; and (c) the MAICC model is unique in its emphasis on citizen science and educational technology.
Implications for citizen science practice include the possibility of massively training citizens using real-world cases on a platform that allows for a self-paced project evaluation process. The integration of true citizen science projects into educational contexts provides the conditions to foster complex and scientific thinking in students, making them aware of the importance of designing citizen science projects with a multi-impact vision to meet the challenges of the 21st-century. In addition, the inclusion of blended pedagogies, technology, and the connections triggered by students’ engagement in the evaluation of citizen science projects serve to reshape traditional educational practices. Citizen science research also benefits from exposure to this model, as the Kumar-based construction method used can be applied to develop other tools similar to the MAICC platform on different topics. In addition, the process of familiarizing students with citizen science through their assessment of projects based on a typology extends research into evaluation-based learning.
Some study limitations must be addressed. While building collaborative practices requires frameworks that ensure equitable participation and integrate all voices and views on the implementation of a model, the current development was tested only in a local setting. In addition, translation into other languages has not been piloted at this stage of the model, which constrains its scalability. The model needs to be tested in multiple settings with a variety of education stakeholders. Furthermore, the involvement of other sectors, such as student communities, academic administrators, or policy makers, is relevant for scaling up the model in a sustainable way. The digital platform makes it possible to go beyond geographical boundaries and extend the benefits of the MAICC Model. Future explorations of the MAICC Model for its continuous evaluation pave the way for inspiring digital educational innovations that develop complex thinking sub-competencies, in particular critical and scientific thinking. In terms of digital educational platforms, we find open areas of exploration in usability, engagement, and technology acceptance. Moreover, research opportunities include the evaluation of methodologies to assess the impact of the model in relation to different educational contexts, initiatives, and learning outcomes.
Data availability statement
Publicly available datasets were analyzed in this study. This data can be found here: doi: 10.5281/zenodo.13334351.
Author contributions
PO-M: Conceptualization, Formal analysis, Investigation, Methodology, Software, Writing – review & editing. JS-Z: Conceptualization, Formal analysis, Funding acquisition, Investigation, Methodology, Software, Writing – original draft, Writing – review & editing. JM-E: Investigation, Software, Writing – review & editing. LQ-G: Investigation, Project administration, Supervision, Writing – review & editing. DV-C: Investigation, Project administration, Writing – review & editing. LS-S: Resources, Writing – review & editing. MG-M: Writing – review & editing. AB: Investigation, Writing – review & editing. LM-M: Investigation, Writing – review & editing. IA-I: Investigation, Writing – review & editing.
Funding
The author(s) declare that financial support was received for the research, authorship, and/or publication of this article. The authors would like to thank the Institute for the Future of Education, Tecnológico de Monterrey for the support provided under the following categories: (a) financial and technical support from the ‘Challenge-Based Research Funding Programme 2023’, Project ID #IJXT070-23EG99001, entitled ‘Complex Thinking Education for All (CTE4A): A Digital Hub and School for Lifelong Learners’; (b) financial and technical support from the Writing Lab in the production of this work; and (c) financial and technical support in the production of this work from the Novus fund (Grant Number: ID N22-290).
Acknowledgments
We acknowledge the use of OpenAI’s ChatGPT-4, based on the GPT-4 architecture, to assist in the initial scaffolding of ideas and drafting text for this article. The AI’s contributions were reviewed and refined by the authors to ensure accuracy and alignment with the research objectives.
Conflict of interest
The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
The handling editor MM declared a shared affiliation with the authors PO-M, JS-Z, JM-E, LQ-G, DV-C, LS-S, MG-M, IA-I at the time of review.
Publisher’s note
All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.
References
Al Mamun, M. A., and Lawrie, G. (2023). Student-content interactions: exploring behavioural engagement with self-regulated inquiry-based online learning modules. Smart Learn. Environ. 10. doi: 10.1186/s40561-022-00221-x
Alfaro-Ponce, B., Sanabria-Z, J., Rivero-Zambrano, L., and Muñoz-Ibáñez, C. (2023). Citizen Science's influence on public policy for addressing complexity: a systematic review of tech-based projects in higher education. J. Soc. Stud. Educ. Res. 14, 169–199.
Almerich, G., Suárez-Rodríguez, J., Díaz-García, I., and Cebrián-Cifuentes, S. (2020). 21st-century competencies: the relation of ICT competencies with higher-order thinking capacities and teamwork competences in university students. J. Comput. Assist. Learn. 36, 468–479. doi: 10.1111/jcal.12413
Andrini, V. S. (2016). The effectiveness of inquiry learning method to enhance students’ learning outcome: a theoretical and empirical review. J. Educ. Pract. 7, 38–42.
Aramburuzabala, P., and Cerrillo, R. (2023). Service-learning as an approach to educating for sustainable development. Sustainability 15:11231. doi: 10.3390/su151411231
Ash, S. L. (2009). Generating, deepening, and documenting learning: the power of critical reflection in applied learning. J. App. Learn. High. Educ. 1, 25–48. doi: 10.57186/jalhe_2009_v1a2p25-48
Astiswijaya, N., Kusnandi, K., and Dadang, J. (2023). A literature review about one of the successful skills of the 21st century: collaborative ability student. Didaktika Religia 11, 201–222. doi: 10.30762/didaktika.v11i1.3442
Ballard, H. L., Dixon, C., and Harris, E. (2017). Youth-focused citizen science: examining the role of environmental science learning and agency for conservation. Biol. Conserv. 208, 65–75. doi: 10.1016/j.biocon.2016.05.024
Bonney, R., Shirk, J. L., Phillips, T. B., Wiggins, A., Ballard, H. L., Miller-Rushing, A. J., et al. (2014). Next steps for citizen science. Science 343, 1436–1437. doi: 10.1126/science.1251554
Calyx, C., and Finlay, S. M. (2022). Improving a framework for evaluating participatory science. Evaluation 28, 150–165. doi: 10.1177/13563890221085996
Canaleta, X., Vernet, D., Vicent, L., and Montero, J. A. (2014). Master in teacher training: a real implementation of active learning. Comput. Hum. Behav. 31, 651–658. doi: 10.1016/j.chb.2013.09.020
Cooper, S. (2014). Putting collective reflective dialogue at the heart of the evaluation process. Reflective Pract. 15, 563–578. doi: 10.1080/14623943.2014.900019
Cruz-Sandoval, M., Parra, J. C. V., Carlos-Arroyo, M., and Zamora, J. A. (2023). Student perception of the level of development of complex thinking: an approach involving university women in Mexico. J. Latin. Educ. 23, 768–780. doi: 10.1080/15348431.2023.2180370
Deev, M., Gamidulaeva, L., Finogeev, A., Finogeev, A., and Vasin, S. (2020). Sustainable educational ecosystems: bridging the gap between educational programs and in-demand market skills. E3S Web Conf. 208:09025. doi: 10.1051/e3sconf/202020809025
Dwyer, C. P., Hogan, M. J., and Stewart, I. (2014). An integrated critical thinking framework for the 21st century. Think. Skills Creat. 12, 43–52. doi: 10.1016/j.tsc.2013.12.004
Edwards, F., Manderscheid, M., and Parham, S. (2023). Terms of engagement: mobilising citizens in edible nature-based solutions. J. Urban. 1–22. doi: 10.1080/17549175.2023.2218356
Haim, A., Shaw, S. T., and Heffernan, N. T. (2023). How to Open Science: a principle and reproducibility review of the learning analytics and knowledge conference. LAK23: 13th international learning analytics and knowledge conference.
Hecker, S., Haklay, M., Bowser, A., Makuch, Z., Vogel, J., and Bonn, A. (2018). Innovation in open science, society, and policy – setting the agenda for citizen science. London: UCL Press.
Kullenberg, C., and Kasperowski, D. (2016). What is citizen science? – a Scientometric Meta-analysis. PLoS One 11:e0147152. doi: 10.1371/journal.pone.0147152
Kumar, V. (2009). A process for practicing design innovation. J. Bus. Strateg. 30, 91–100. doi: 10.1108/02756660910942517
Kurtulmus, F. (2021). “The democratization of science” in Global epistemologies and philosophies of science. eds. D. Ludwig, I. Koskinen, Z. Mncube, L. Poliseli, and L. Reyes-Galindo (London: Routledge).
LaVelle, J. M., Neubauer, L. C., Boyce, A. S., and Archibald, T. (2023). Setting the stage for critically defined and responsive evaluator education and training. N. Dir. Eval. 2023, 13–22. doi: 10.1002/ev.20542
Ledezma, C., Font, V., and Sala, G. (2023). Analysing the mathematical activity in a modelling process from the cognitive and onto-semiotic perspectives. Math. Educ. Res. J. 35, 715–741. doi: 10.1007/s13394-022-00411-3
Leif, F., Steven, G., and Carol, T. (2023). Improving student motivation and learning outcomes through inquiry learning. World Psychol. 2, 11–25. doi: 10.55849/wp.v2i1.389
Levontin, L., Gilad, Z., Shuster, B., Chako, S., Land-Zandstra, A. M., Lavie-Alon, N., et al. (2022). Standardizing the assessment of citizen scientists’ motivations: a motivational goal-based approach. Citizen Sci. 7:25. doi: 10.5334/cstp.459
Lüsse, M., Brockhage, F., Beeken, M., and Pietzner, V. (2022). Citizen science and its potential for science education. Int. J. Sci. Educ. 44, 1120–1142. doi: 10.1080/09500693.2022.2067365
McKiernan, E. C., Bourne, P. E., Brown, C. T., Buck, S., Kenall, A., Lin, J., et al. (2016). How open science helps researchers succeed. eLife 5:e16800. doi: 10.7554/eLife.16800
Meniado, J. C. (2023). Digital language teaching 5.0: technologies, trends and competencies. RELC J. 54, 461–473. doi: 10.1177/00336882231160610
Mhlongo, S., Mbatha, K., Ramatsetse, B., and Dlamini, R. (2023). Challenges, opportunities, and prospects of adopting and using smart digital technologies in learning environments: an iterative review. Heliyon 9:e16348. doi: 10.1016/j.heliyon.2023.e16348
Nollmeyer, G. E., and Baldwin, K. A. (2022). Inquiry-based learning: definition, history, and frameworks. London: Routledge.
Nugent, J., Smith, W. S., Cook, L. A., and Bell, M. (2015). 21st-century citizen science: from global awareness to global contribution. Sci. Teach. 82:34. doi: 10.2505/4/tst15_082_08_34
Ogwu, E. N., Emelogu, N. U., Azor, R. O., and Okwo, F. A. (2022). Educational technology adoption in instructional delivery in the new global reality. Educ. Inf. Technol. 28, 1065–1080. doi: 10.1007/s10639-022-11203-4
Ozden, P., and Velibeyoglu, K. (2023). Citizen science projects in the context of participatory approaches: the case of Izmir. J. Des. Res. Arch. Plan. 4, 31–46. doi: 10.47818/drarch.2023.v4i1081
Patiño, A., Ramírez-Montoya, M. S., and Ibarra-Vazquez, G. (2023). Trends and research outcomes of technology-based interventions for complex thinking development in higher education: a review of scientific publications. Contemp. Educ. Technol. 15:ep447. doi: 10.30935/cedtech/13416
Phillips, T., Porticella, N., Constas, M. A., and Bonney, R. (2018). A framework for articulating and measuring individual learning outcomes from participation in citizen science. Citizen Sci. 3:3. doi: 10.5334/cstp.126
Reddy, P., Sharma, B., and Chaudhary, K. (2020). Digital literacy: a review of literature. Int. J. Technoethics 11, 65–94. doi: 10.4018/IJT.20200701.oa1
Roche, J., Bell, L., Galvão, C., Golumbic, Y., Kloetzer, L., Knoben, N., et al. (2020). Citizen science, education, and learning: challenges and opportunities. Front. Sociol. 5:613814. doi: 10.3389/fsoc.2020.613814
Sala Sebastià, G., Barquero, B., and Font, V. (2021). Inquiry and modeling for teaching mathematics in interdisciplinary contexts: how are they interrelated? Mathematics. 9:1714. doi: 10.3390/math9151714
Sala Sebastià, G., Font, V., Giménez, J., and Barquero, B. (2017). “Inquiry and modelling in a real archaeological context” in Mathematical modelling and applications. International perspectives on the teaching and learning of mathematical modelling. eds. G. Stillman, W. Blum, and G. Kaiser (Cham: Springer), 325–335.
Sanabria-Z, J., Molina-Espinosa, J.-M., Alfaro-Ponce, B., and Vycudilíková-Outlá, M. (2022). A threshold for citizen science projects: complex thinking as a driver of holistic development. RIED-Revista Iberoamericana de Educación a Distancia 25, 113–131. doi: 10.5944/ried.25.2.33052
Sánchez, A., Font, V., and Breda, A. (2022). Significance of creativity and its development in mathematics classes for preservice teachers who are not trained to develop students’ creativity. Math. Educ. Res. J. 34, 863–885. doi: 10.1007/s13394-021-00367-w
Sarva, E., Slišāne, A., Oļesika, A., Daniela, L., and Rubene, Z. (2023). Development of education field student digital competences—student and stakeholders’ perspective. Sustain. For. 15:9895. doi: 10.3390/su15139895
Shyshkina, M. (2024). “The methodology for using the cloud-based Open Science Systems in Higher Education Institutions” in Towards a hybrid, flexible and socially engaged higher education. eds. M. E. Auer, U. R. Cukierman, E. Vendrell Vidal, and E. Tovar Caro (Cham: Springer), 287–294.
Siddiq, F., Olofsson, A. D., Lindberg, J. O., and Tomczyk, L. (2023). What will be the new normal? Digital competence and 21st-century skills: critical and emergent issues in education. Educ. Inf. Technol. 29, 7697–7705. doi: 10.1007/s10639-023-12067-y
Tavares, M., Azevedo, G., Marques, R., and Bastos, M. (2023). Challenges of education in the accounting profession in the era 5.0: a systematic review. Cogent Bus. Manage. 10:2220198. doi: 10.1080/23311975.2023.2220198
Technology in education – 2023 GEM Report (2023). Available at: https://gem-report-2023.unesco.org/technology-in-education/
Theobald, E., Ettinger, A., Burgess, H., DeBey, L., Schmidt, N., Froehlich, H., et al. (2015). Global change and local solutions: tapping the unrealized potential of citizen science for biodiversity research. Biol. Conserv. 181, 236–244. doi: 10.1016/j.biocon.2014.10.021
Thornhill-Miller, B., Camarda, A., Mercier, M., Burkhardt, J., Morisseau, T., Bourgeois-Bougrine, S., et al. (2023). Creativity, critical thinking, communication, and collaboration: assessment, certification, and promotion of 21st century skills for the future of work and education. J. Intelligence 11:54. doi: 10.3390/jintelligence11030054
Turrini, T., Dörler, D., Richter, A., Heigl, F., and Bonn, A. (2018). The threefold potential of environmental citizen science - generating knowledge, creating learning opportunities and enabling civic participation. Biol. Conserv. 225, 176–186. doi: 10.1016/j.biocon.2018.03.024
UNESCO (2021). Recommendation on Open Science. United Nations Educational, Scientific and Cultural Organization. Available at: https://unesdoc.unesco.org/ark:/48223/pf0000379949.locale=en
Vázquez-Parra, J. C., Henao-Rodríguez, L. C., Lis-Gutiérrez, J. P., Castillo-Martínez, I. M., and Suárez-Brito, P. (2024). eComplexity: validation of a complex thinking instrument from a structural equation model. Front. Educ. 9:1334834. doi: 10.3389/feduc.2024.1334834
Wehn, U., Gharesifard, M., Ceccaroni, L., Joyce, H., Ajates, R., Woods, S., et al. (2021). Impact assessment of citizen science: state of the art and guiding principles for a consolidated approach. Sustain. Sci. 16, 1683–1699. doi: 10.1007/s11625-021-00959-2
Keywords: citizen science, Open Science, complex thinking, higher education, educational innovation
Citation: Olivo-Montaño PG, Sanabria-Z J, Molina-Espinosa JM, Quintero-Gámez L, Velarde-Camaqui D, Sánchez-Salgado LA, Gonzalez-Mendoza M, Breda A, Morales-Maure L and Alvarez-Icaza I (2024) MAICC model: development of complex thinking through citizen science project evaluation. Front. Educ. 9:1392104. doi: 10.3389/feduc.2024.1392104
Edited by:
Miguel A. Montoya, Institute for the Future of Education, Spain
Reviewed by:
Ann Borda, The University of Melbourne, Australia
Derling Jose Mendoza Velazco, National University of Chimborazo, Ecuador
Copyright © 2024 Olivo-Montaño, Sanabria-Z, Molina-Espinosa, Quintero-Gámez, Velarde-Camaqui, Sánchez-Salgado, Gonzalez-Mendoza, Breda, Morales-Maure and Alvarez-Icaza. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
*Correspondence: Jorge Sanabria-Z, jorge.sanabria@tec.mx