
SYSTEMATIC REVIEW article

Front. Educ., 20 June 2024
Sec. Digital Education

Assessment of accessibility in virtual laboratories: a systematic review

  • School of Computing, University of Eastern Finland, Joensuu, Finland

In an era of rapid evolution in educational technologies, Virtual Labs (V-Labs) have emerged as a promising solution, fundamentally altering how learners engage with scientific concepts and experiments. Despite their potential, ensuring their effectiveness and inclusivity in terms of accessibility to diverse students remains a challenging task. Currently, there is limited insight into the accessibility of V-Labs, a gap that our study aims to address. This study seeks to ascertain the effectiveness of V-Labs in terms of accessibility and inclusivity. We synthesized empirical studies, reviewing 36 articles published between 2000 and 2023. Of these articles, 69% of the studies were conducted in higher education and covered a wide range of learning environments. Our study revealed that 47.3% of the studies focused on various engineering subjects. Our findings provide insight into V-Labs’ accessibility from four key perspectives: (a) students with limited abilities, (b) diverse cultural and linguistic backgrounds, (c) instructional design features and content availability, and (d) interaction supporting features. We also identified existing gaps in the accessibility of the V-Labs from the four perspectives. Furthermore, we examined the assessment methods of V-Labs, shed light on the aspects that are evaluated, and underscored the need for future work on assessment strategies.

1 Introduction

The emergence of virtual educational environments, specifically Virtual Laboratories (V-Labs), has transformed traditional approaches to teaching scientific experimentation for students and researchers, opening new and innovative ways to explore and understand complex phenomena. A virtual lab is a digital simulation environment in which experiments or simulation activities are conducted virtually, without the need for a physical lab. Recently, virtual labs have been widely used across the primary, secondary, and higher education sectors due to their ability to engage students and educators through simulation in teaching and learning settings.

While V-Labs have been found effective in enhancing student learning outcomes compared to traditional labs, they also provide distinct benefits for enabling experimental analysis across several disciplines (Al-Nakhle, 2022). These benefits include cost efficiency by eliminating physical space and equipment (Apiola et al., 2022; Elmoazen et al., 2023), a safe environment for experiential learning (Potkonjak et al., 2016), time efficiency by eliminating setup, and global accessibility (He et al., 2022). Despite this potential, access to V-Labs depends on internet reachability and connectivity, which may create disparities in access and hinder inclusivity. Research by Van den Beemt et al. (2023) suggests that V-Labs promote active student engagement, improve learning outcomes and academic performance (Wahyuman et al., 2021), facilitate self-directed learning (Reginald, 2023), and motivate learners through gamified interactions.

To fully utilize V-Labs for teaching and learning, a multifaceted strategy is needed. For instance, ensuring proper functionality, regular maintenance, and use of the latest technology is essential. V-Labs should be designed to captivate learners, make experimentation fun, and integrate into curricula that emphasize the practical application of course content; some studies (e.g., Sapriati et al., 2023) evaluate the effectiveness and outcomes of using V-Labs in various disciplines. In addition, incorporating accessible features in V-Labs for diverse learners, including those with visual or other impairments, is essential. Consequently, the effectiveness of V-Labs' accessibility depends on how they are implemented and integrated into the learning environment.

Several reviews have recently explored many facets of V-Labs. For example, some studies address barriers, challenges, and possible advances in the area (e.g., Alnagrat et al., 2021; Deriba et al., 2023). Other reviews have highlighted trends and advancements in the academic landscape of V-Labs (Raman et al., 2022). In addition, specific studies examined the implementation and benefits of V-Labs within various disciplines and courses (Fan et al., 2021; Reeves and Crippen, 2021). The remaining reviews examined the application of learning analytics and assessment methods within V-Lab environments (Elmoazen et al., 2023). However, it is important to note that few studies have examined the accessibility of V-Labs or explored the current state of the V-Lab environment. To address these gaps, our study poses two key research questions:

(RQ.1) What essential accessibility features are integrated into V-Labs to cater to diverse user needs?

(RQ.2) How do existing studies assess the accessibility of V-Labs, and which aspects do they assess?

This paper provides insights from a systematic review, making at least the following contributions:

• Identifying diverse virtual learning environments across various disciplines, focusing on V-Labs to cater for those with specific educational needs.

• Examining the assessment of V-Labs with a particular focus on accessibility features, usability, user experience, and the engagement of educators and students.

• Examining and discussing the accessibility features of V-Labs and addressing gaps in accommodating users with disabilities, diverse cultural and linguistic backgrounds, instructional design characteristics, and interaction design support.

• Examining the gaps for future research and development of enhanced inclusivity in virtual learning environments.

The remainder of this paper is organized as follows: Section two reviews the existing state of the art, focusing on accessibility, inclusiveness, and the existing challenges of V-Labs. Section three outlines the methods employed in the study. Section four presents the synthesized results from the studies, and discussions are presented in section five. Finally, we provide concluding remarks in section six.

2 Background

2.1 Virtual labs across various fields of study

Across various academic disciplines, V-Labs offer an immersive and interactive approach to understanding complex concepts (Esquembre, 2015). In the life sciences, for instance, V-Labs are used to simulate complex biological processes, such as mitosis and other dynamic phases of the cell cycle (Muhamad et al., 2012). These simulations help demonstrate how data on these phases are collected and analyzed. Additionally, V-Labs enable understanding of experimental procedures in gene expression (Akhigbe and Ogufere, 2020), the structure of DNA, enzyme-controlled reactions, and cell reproduction behavior (Flowers, 2011), and assist in analyzing and visualizing microscopy in teaching (Pylvänäinen et al., 2023). This indicates that students and researchers can understand complex life science processes without the need for expensive laboratory equipment.

Additionally, V-Labs have been widely used in the physical sciences, including physics, where they simulate a wide range of phenomena such as projectile motion, allowing students to observe how variables like launch angle, initial velocity, and air resistance affect the trajectory (Chen, 2010; El Kharki et al., 2021). Furthermore, V-Labs in physics use representations of otherwise unobservable light rays to assist students in comprehending the behavior of light (De Jong et al., 2013). This includes understanding how light rays travel, the outcomes when light rays encounter obstacles or colored acetate, and the colors of the rays involved. Similarly, in chemistry, V-Labs are utilized for various activities (Tatli and Ayas, 2010), for example, to experiment with the titration process, to observe how chemical changes occur (Tatli and Alipasa, 2012), and to see how different factors, such as the pH of the solution and the indicator used, affect the results of an experiment (Harvey et al., 2022). This indicates that V-Labs enable students to manipulate different chemical processes and principles, observe reactions, and analyze the outcomes without substances that pose risks to human health, the environment, or property due to their physical, chemical, or biological properties.

Furthermore, in engineering disciplines, V-Labs have been used to design, analyze, and test different systems (Balamuralithara and Woods, 2009). For example, in Mechanical Engineering, they enable students to observe the behavior of materials and fluids under different conditions and analyze their properties (Uribe et al., 2016). In Electrical Engineering, V-Labs are utilized for analyzing electrical circuits, and measuring voltage, current, and resistance (Dhang and Kumar, 2023). They also serve to assemble and test electronic circuits, including amplifiers, filters, and digital systems (Hernandez-Jayo and Garcia-Zubia, 2016). In Civil Engineering, V-Labs are employed to analyze forces and deformations in structures such as beams, trusses, and frames (Turkan et al., 2017). Additionally, they are used to experiment with concrete mixtures, strength, and durability in materials (Ramezanianpour and Jovein, 2012).

The field of medicine and healthcare has also witnessed the integration of V-Labs to complement traditional medical practices (Cheung et al., 2023). For example, in anatomy, V-Labs are used for medical students to practice surgical simulation, helping them to understand surgical techniques and procedures (Shen et al., 2006; Prentice, 2016). In pathology, V-Labs are employed to analyze and identify tissue histology structures, cellular abnormalities, and diseases (Barisoni et al., 2020; Kainulainen et al., 2023). V-Labs are also used to diagnose virtual patients, explore the effects of different drugs (Abutaleb, 2020), and learn about various medical procedures through sophisticated virtual simulations (Zare Bidaki, 2018). These simulations offer a safe and controlled environment for honing medical skills and enhancing decision-making abilities.

Although studies show that V-Labs are less commonly used in social science disciplines (Marks and Thomas, 2022), they are beginning to gain traction as a valuable tool. In economics, for example, V-Labs are used to simulate market behavior and test economic theories (Innocenti, 2017). While they remain a relatively new learning environment in many disciplines, they hold the potential to revolutionize the way experiments are conducted and theories are tested. Thus, V-Labs have opened doors across various disciplines, providing a versatile and accessible platform for students and researchers to explore and comprehend concepts, from simple to complex, fostering a practical understanding of knowledge. In this study, we explore and comprehensively analyze V-Labs across various fields of study.

2.2 Virtual labs: accessibility features

Different researchers have employed diverse terminology to articulate accessibility attributes in distinct manners when referring to digital tools. For instance, Aljuhani et al. (2018) used usability features to identify their experimental tools' ease of use, clarity of the experiments' steps, navigation of the screens, and the interaction and enjoyability of their tools. Chaiyarak et al. (2021) mentioned universal design elements to express the tool's ease of accessing materials, its customizable options, and its support for physical, visual, auditory, sensory, communication, and interest impairments. In Robinson et al. (2017), functionality features were used to express the compatibility, portability, and multimedia capability of the tool. The remaining studies termed these accessibility options. These terms were all used interchangeably to express the accessibility features of digital tools, specifically for virtual learning environments. Few studies have attempted to equip their digital learning environments with accessibility features. For example, Ertugrul (2000) identified and applied 11 criteria (or accessibility features) to consider while building V-Labs, such as user-friendliness, multi-platform portability, compatibility, an extendable library, debugging features, and more.

V-Labs' features may vary depending on the nature of the proposed field (or subject), the specific aim, and the type of V-Lab. Assessing the accessibility of the lab environment is essential to ensure that all students are served equally. A study by Reginald (2023) identified that, without appropriate features in digital tools, user engagement may be obstructed, making it difficult for learners to fully participate in the learning process. One contributing factor to these issues is the lack of well-defined or established guidelines for evaluating digital learning platforms, particularly V-Labs. In this study, we focus on a comprehensive analysis of the accessibility features considered in the implementation of V-Labs.

2.3 Inclusive and accessible virtual labs

The true potential of a V-Lab is realized through the integration of inclusivity and accessibility. However, identifying accessible features and developing V-Labs for diverse learners is challenging for various reasons, for example, the failure to incorporate accessible design features in the lab, such as diverse representation in lab scenarios, responsive design, and the use of multiple modalities (e.g., text, images, video) to deliver information. Similarly, Liu et al. (2022) state that clear and consistent visual design, intuitive navigation, and responsive design are essential accessibility features for students with special needs. A report from Purdue University (Anderson et al., 2022) identifies the lack of specialized design features as one of the challenges faced by individuals with disabilities. To overcome these issues, Elfakki et al. (2023) developed and evaluated a 3D V-Lab for students with disabilities. Their study demonstrates the effectiveness of the 3D virtual environment in enhancing cognitive skills related to physics subjects.

On the other hand, inclusive V-Labs must encompass considerations for various disciplines as well as for participants with diverse needs, characteristics, backgrounds, abilities, and learning styles, for example, students with different learning modalities (e.g., visual, auditory, kinesthetic), students at diverse stages of cognitive development and ability, and students with disabilities or special needs. A few studies have investigated diverse learners and students with visual or other impairments. For instance, Thompson et al. (2022) explored inclusive V-Labs for diverse learners and found that students' knowledge and confidence improve when they participate in the labs. Similarly, Gavronskaya et al. (2021) examined how inclusive online labs improved outcomes for students with special needs, especially those who had previously struggled in science classes.

Together, these studies highlight the need for focused attention and a comprehensive analysis of both the accessibility features and the inclusiveness of existing labs.

2.4 Challenges of accessibility in virtual labs

The use of V-Labs in education has grown significantly over the past few years, but making them accessible to all students presents a complex set of challenges, including technological, pedagogical, and infrastructural ones (Deriba et al., 2023). Technological and infrastructural challenges include the lack of essential design components; difficulty in integration and interoperability with multiple systems (or platforms); and issues with media, file, and executable standards (Vary, 2000). Balancing functionality with ease of use is also challenging. For example, according to Leporini and Buzzi (2019), developing V-Labs compatible with assistive technologies is difficult because different technologies and their compatibility within the V-Lab platform must be considered. This necessitates a deep investigation of accessibility requirements and compatibility across different platforms. Similarly, Kapici et al. (2019) underlined that ensuring the accessibility of V-Labs across multiple platforms and devices can be difficult, especially if they depend on specific operating systems. Furthermore, since V-Labs are widely accessed over the Internet, they are susceptible to network latency, bandwidth limitations, and connectivity disruptions (Jara et al., 2009). Maintaining a consistent experience across different network conditions, and providing the underlying resources required to support V-Labs' accessibility (Potkonjak et al., 2016), is essential. For example, ensuring that V-Labs are hosted on reliable servers with sufficient bandwidth is important.

Pedagogically, despite the need for V-Labs to be designed in a way that supports learning objectives and allows all learners to participate fully regardless of their abilities, most V-Labs neglect to do so. For instance, Ji et al. (2022) highlight the challenges in designing V-Labs that offer visual and audio feedback to enhance accessibility for learners with visual impairments. This is due to the need for precise synchronization between visual and audio elements (Misiejuk et al., 2023) and for ensuring that the feedback is meaningful and effectively conveys the necessary information to learners. Furthermore, Nedungadi et al. (2018) suggested that incorporating Universal Design for Learning (UDL) principles into V-Labs is necessary to make the learning environment more engaging and interactive.

To broaden the scope of designing, developing, and implementing V-Labs, and to address these challenges, our study investigates the identification of essential accessibility features. This endeavor aims to overcome these challenges and ensure that diverse learners have full access to the lab environment.

2.5 Related literature and gap analysis

There have been numerous efforts to synthesize V-Lab research in previous studies. Most researchers have extensively focused on exploring V-Labs’ applications, highlighting their ability to offer superior practical learning experiences compared to traditional labs (Sypsas and Kalles, 2018; Sasongko and Widiastuti, 2019; Elmoazen et al., 2023). Another study (Zhang et al., 2021) assessed the transition from traditional methods to V-Labs, emphasizing their effectiveness in improving student learning outcomes. Specific studies (Byukusenge et al., 2022) attempted to identify commonly taught topics via V-Labs and associated learning outcomes. Additionally, investigations (Sapriati et al., 2023) into students’ self-regulated learning behaviors in V-Lab settings aimed to understand their impact on cognitive, metacognitive, motivational, behavioral, and contextual educational strategies.

On the other hand, the design and effectiveness of V-Lab subject content have been other focal points, and a few researchers have examined V-Lab content utilization and assessed specific learning variables (Chan et al., 2021; Reeves and Crippen, 2021). They identified critical elements related to technology and instructional design within V-Labs, comparing their effectiveness against traditional teaching methods and hands-on lab experiences. While several related reviews have been conducted, as shown in Table 1, there is a lack of studies exploring the inclusivity of diverse student populations, including students with different abilities.


Table 1. Summary of related literature and their focus of study.

Recently, researchers have started to explore accessibility in V-Labs. For example, Deriba et al. (2023) identified accessibility barriers within V-Labs and called for further study of V-Labs' accessibility features. This motivated us to systematically synthesize V-Lab research, focusing on the inclusiveness and effectiveness of accessibility features.

3 Methodology

The authors followed a systematic literature review guideline (Page et al., 2021) and the eight basic steps of systematic reviews described by Okoli and Schabram (2015). These guidelines were used to identify the objective of the review, develop a draft protocol, establish inclusion and exclusion criteria, conduct a comprehensive search for relevant data, and assess the quality of selected articles before writing the review.

First, the authors identified the study’s objective, which was to examine the accessibility features and inclusiveness of V-Labs in addressing research questions. After defining the scope of the review, the authors collaborated regularly to develop a protocol, aimed at minimizing bias during publication selection and data processing. By meticulously planning the review strategy, as outlined by Fink (2019), the protocol ensured reproducibility and consistency. It encompassed various aspects, including information sources, literature search strategy, inclusion criteria, quality assessment of studies, data extraction, and data synthesis methods, following the guidelines established by Kitchenham et al. (2007).

The criteria for including and excluding studies were derived from the research questions (Kitchenham, 2004). To be considered for inclusion, the reviewed articles needed to address accessibility features and accommodate diverse learning environments in V-Labs. Table 2 shows the inclusion and exclusion criteria applied in the selection process.


Table 2. Inclusion and exclusion criteria used in the study.

3.1 Database and literature search

The authors selected three well-established databases for their search: Scopus, Web of Science (WoS), and ERIC. Both Scopus and WoS databases apply stringent inclusion criteria for journals and conferences, possess a robust meta-data system, and are widely recognized as reputable sources for literature searches (Kumpulainen and Seppänen, 2022). Additionally, ERIC is an educational database encompassing a diverse array of educational literature (Robbins, 2001).

We conducted multiple search iterations using different combinations of keywords. Initially, we combined the keywords "virtual laboratory" and "accessibility," which severely restricted our findings and caused us to miss several relevant papers. This limitation occurred because some authors did not use the word "laboratory" in their studies; it appeared in various forms, such as lab, labs, or laboratories, so those papers were overlooked by the initial keyword selection. To address this issue, we broadened the search to retrieve any article containing the wildcard keyword "Virtual lab*", together with "accessib*" to cover accessibility and accessible. Then, drawing on the researchers' expertise, we qualitatively identified which of these articles were truly about virtual laboratories. The search formula that produced the most effective results, ("Virtual Lab*") AND ("Accessib*"), was applied to the metadata of the papers in each database.
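For concreteness, the final search formula can be written out per database. The snippet below is an illustrative sketch only: the field codes (`TITLE-ABS-KEY` for Scopus, `TS` for Web of Science) and the ERIC handling are assumptions based on each database's common query conventions, not quotations from the study's search logs.

```python
# Illustrative sketch of the final search formula applied to titles,
# abstracts, and author keywords. Field codes are assumed conventions
# (Scopus: TITLE-ABS-KEY, Web of Science: TS), not taken from the study.
terms = '("Virtual Lab*") AND ("Accessib*")'

queries = {
    "Scopus": f"TITLE-ABS-KEY({terms})",
    "Web of Science": f"TS=({terms})",
    "ERIC": terms,  # assumed: ERIC's basic search covers these fields
}

for database, query in queries.items():
    print(f"{database}: {query}")
```

The wildcards make the query match lab, labs, laboratory, and laboratories, as well as accessible and accessibility, which is exactly the coverage the narrower initial keywords missed.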

The chosen combination of keywords was used to search article titles, abstracts, and author keywords. The search was conducted over four days, starting on the 10th of May, 2023. In total, 392 articles were retrieved from the three databases: 277 from Scopus, 92 from WoS, and 23 from ERIC. The retrieved studies were published from 2000 to 2023. Subsequently, all articles were uploaded to the Rayyan systematic review tool (Ouzzani et al., 2016) for analysis.

Initially, 64 studies were excluded because they were not peer-reviewed original articles. Afterward, duplicate articles (n = 115) were removed, resulting in 213 unique articles. The first set of researchers reviewed and assessed the abstracts, titles, and keywords of the first 100 papers. During this process, studies categorized as "maybe" were discussed, and their full texts were examined to resolve any uncertainties. The remaining articles were then independently reviewed and filtered by both the first and second researchers. A meeting was held afterward, where all authors discussed ambiguous cases and resolved them by consensus. Based on the predefined inclusion and exclusion criteria, the title and abstract scan identified 107 publications deemed suitable for full-text review (see Figure 1). After the full-text review determined the included studies, data extraction was performed by the first author.
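As a quick sanity check, the screening counts reported above are internally consistent; the sketch below simply re-derives them from the numbers given in the text.

```python
# Consistency check of the study-selection counts reported in this section.
retrieved = {"Scopus": 277, "WoS": 92, "ERIC": 23}

total_retrieved = sum(retrieved.values())   # 392 records in total
after_exclusion = total_retrieved - 64      # 64 non-peer-reviewed removed
unique_articles = after_exclusion - 115     # 115 duplicates removed

assert total_retrieved == 392
assert unique_articles == 213               # matches the 213 unique articles

full_text_screened = 107  # passed the title/abstract scan (Figure 1)
final_corpus = 36         # studies included after full-text review
print(total_retrieved, unique_articles, full_text_screened, final_corpus)
```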


Figure 1. Study selection process.

To acquire data from the included articles, relevant information was systematically collected using a codebook (Büchter et al., 2020), following the codification of Heikkinen et al. (2022), as presented in Table 3. This approach aimed to minimize individual differences between the reviewers. The first 10 studies were coded by the first author, and then a meeting was held to discuss any discrepancies with the second author and to finalize the codebook before coding the remaining articles. Subsequently, a quality check was conducted on the retrieved papers before commencing the synthesis stage. During this phase, the authors organized and discussed all the gathered data within the framework of the review hypothesis (Webster and Watson, 2002).


Table 3. Codification schema used for the study.

Based on the codification schema, the extracted information consists of various categories. These include study characteristics such as descriptive statistics, study design, participant details, types of V-Labs, the specific disciplines for which the V-Labs were developed, educational settings and levels, the V-Lab development environment, and V-Lab evaluation measures and metrics. In addition, accessibility design features and instructional design accessibility were extracted from the studies, guided by Huang and Benyoucef (2023).

4 Results

4.1 Study characteristics

The study characteristics are presented in the following sections, encompassing descriptive statistics of the reviewed studies, the participants' educational levels, the topics covered in the V-Labs, the research designs employed, and details about the virtual learning environments developed. Subsequently, we present the aspects of accessibility and the evaluation methods used to measure the effectiveness of the V-Labs.

4.1.1 Description of included studies

A total of 36 studies were included in this review; they were conducted on four different continents and covered a total of thirteen countries. Figure 2 illustrates that the articles were published between 2000 and 2022, with notable peaks in 2002, 2010, 2017, 2021, and 2022, each featuring three articles. This observation underscores the continued growth and widespread utilization of V-Labs in educational sectors. Among the reviewed articles, 24 were presented at conferences, while the remaining 12 were published in peer-reviewed journals.


Figure 2. Type of studies and year distribution.

4.1.2 Education level

The reviewed studies were conducted across primary, middle school, and higher education. The majority of the developed labs (69%, N = 25) were tailored for higher education students, encompassing both graduate and undergraduate levels. Within this higher education context, two labs were specifically designed for visually impaired students. Six studies (17%) focused on general education levels (see Figure 3), offering labs that catered to students across all educational levels, while others placed a specific emphasis on groups such as chemistry majors. Furthermore, other labs were developed for researchers and computational scientists. Finally, three studies (8%) were conducted for K-12 students.


Figure 3. Level of education.

4.1.3 Subject

We examined the particular subject areas addressed by the V-Labs developed in the reviewed studies. A total of 22 disciplines in science and engineering were included. A quarter of the studies (N = 9; 25%) were conducted for Electrical Engineering-related subjects (e.g., Electrical Engineering; electrical and electronic; electrical and computer). This was followed by eight studies (22%) in other engineering subjects (e.g., Instrumentation and Control, Civil and Structural, Control, Geomatics, and Mechanical and Aerospace Engineering).

Five studies (14%) were in medical and biology-related subjects (e.g., Microbiology, Biomedical, Cell Biology, and Neuroscience), four (11%) were developed for Computer Science, and three were for multidisciplinary subjects (e.g., STEM). Additionally, two studies addressed Chemistry. Finally, the remaining three studies were conducted on Cybernetics, Applied Technology, and Defense and Federal Civilian Agencies subjects, listed under various categorizations (see Figure 4).


Figure 4. V-Lab categories by subject in the reviewed studies.

4.1.4 Design used in the studies

Figure 5 presents the study designs and evaluation methods used in the reviewed studies. The studies in this review employed various designs: 50% used design-based approaches (N = 18), 27.7% were experimental studies (N = 10), and three were case studies (Drigas et al., 2006; Trnka et al., 2016; Li, 2022). The remaining five studies used mixed methods, including design and prototype (Guerra et al., 2009), case study and prototype (Gómez-Chabla et al., 2017), design and pilot study (Ceccacci et al., 2022), design-based study and case study (Almaatouq et al., 2021), and experimental study and case study (Bähring et al., 2004).


Figure 5. Study design and evaluation method used in studies.

Three different assessment types were used to evaluate V-Lab performance: (1) feedback, (2) self-assessment reports, and (3) pre-test and post-test scores. For the first type, 14 studies (38.9%) used student feedback, and three studies used student and instructor feedback, with various evaluation metrics. Among the student-feedback studies, six used a Likert scale, two used open-ended questions, two included attitudinal questions, and another two used both a Likert scale and open-ended questions. Notably, two studies only mentioned that their lab was evaluated based on student feedback but did not provide specific assessment metrics. Among the studies using instructor and student feedback, two used a Likert scale, while one did not report evaluation metrics; that study focused on the perceived effectiveness and usability of lab aspects.

The second type of evaluation was self-testing or lab-usage assessment, used by two studies (Colak et al., 2014; Almaatouq et al., 2021), which applied accuracy-of-comparison and outcome-score-rate metrics, respectively. Finally, the third evaluation method, pre-post experimental testing, was used by only one study (Dinc et al., 2021), focusing on performance metrics such as experiment completion time and error rate. However, the remaining 16 studies (44.4%) did not report their evaluation metrics.
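The evaluation-method counts above account for the full corpus of 36 studies; a minimal tally, using the figures from the two preceding paragraphs, confirms this.

```python
# Tally of evaluation approaches across the 36 reviewed studies.
evaluation_counts = {
    "student feedback": 14,              # reported as 38.9%
    "student + instructor feedback": 3,
    "self-testing / lab usage": 2,
    "pre-post experimental test": 1,
    "no metrics reported": 16,           # reported as 44.4%
}

total_studies = sum(evaluation_counts.values())
assert total_studies == 36

share_feedback = round(evaluation_counts["student feedback"] / 36 * 100, 1)
share_missing = round(evaluation_counts["no metrics reported"] / 36 * 100, 1)
assert (share_feedback, share_missing) == (38.9, 44.4)
```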

4.1.5 Virtual environment

The studies reviewed utilized a variety of learning environments in different areas of study. The Denver Assistance Lab (Watters et al., 2020) was developed for general science. Likewise, Amrita Lab (Achuthan et al., 2011) was designed for science and engineering. The 3D Museum Lab (Ceccacci et al., 2022) and E-Culture Lab (Drigas et al., 2006) were developed for public users and academic purposes.

In the field of computer science, labs include the Virtual Robotic Lab (Gómez-Chabla et al., 2017), the Empirica Lab (Almaatouq et al., 2021), and the Intrusion Detection System Lab (Li, 2022); the Electronic Circuit Lab (Tejedor et al., 2008) was also developed for computer science lessons. The AR-based Lab (Dinc et al., 2021) and the Microsoft Forms-based lab (Galang et al., 2022) were developed for chemistry. The Medical Imaging Lab (Dikshit et al., 2005) was developed for the biomedical field. The remaining labs were developed for various engineering disciplines. For example, the Microprocessor Lab (Bähring et al., 2004) and the Photorealistic 3D Lab (Adamo-Villani et al., 2006) were designed for Electrical and Computer Engineering. The Satellite Image Processing and Analysis Lab (Buddhiraju et al., 2018) is used in Geomatics Engineering. The Control System Lab (Guerra et al., 2009) used Java-based simulation for Electrical and Electronic Engineering, and the RLC Circuit Simulation Lab (Colak et al., 2014) was employed in Electrical Engineering. The Photon Quantum Lab (Ghazinejad et al., 2021) and the Wireless Sensor Networks Lab (Picovici et al., 2008) were developed for Mechanical and Aerospace Engineering.

In addition, the Optical Spectrometry Lab (Thoms and Girwidz, 2017) catered to Physics, while the Go Lab (de Jong et al., 2014) focused on STEM education. The CALSB Lab (Yuan and Teng, 2002) was dedicated to Civil and Structural Engineering, and the Noise and Machine Vibrations Lab (Redel-Macias et al., 2015) to Electrical and Electronic Engineering. Lastly, the Microelectronics Lab (VijayKumar et al., 2002) was designed in visual 3D form for Electrical Engineering, and the Controller Design Lab (Sánchez et al., 2012) utilized a Java simulator for Control Engineering. The V-Lab environment settings and the associated fields of study are shown in Figure 6.


Figure 6. Virtual development environment.

4.2 Aspects of accessibility assessment

In this study, we investigated how the reviewed studies assessed their labs. Out of 36 studies, 15 (42%) examined the effectiveness of their labs, while four studies did not specify their evaluation results [S3, S15, S20, S22]. The effectiveness of these V-Labs was evaluated from three angles: students' or instructors' engagement levels, usability of and interaction with the system, and user satisfaction and experience with the V-Labs.

4.2.1 Usability and interaction with the V-labs

The effectiveness of the developed labs was evaluated through the usability of the V-Lab and students' interaction with it. Seven studies [S1, S5, S17, S27, S28, S34, S35] focused on usability and interaction with the system, each with a different emphasis. For example, [S1] evaluated usability in terms of ease of use, user-friendliness, and complexity, concluding that users rated 50% of the system as usable; students also rated the system's interaction functionality above 76% and found it very helpful. In [S5], usability was assessed based on the V-Lab's resourcefulness in resource-limited environments. In [S17], the authors evaluated usability and contribution to education, finding that 97% of students improved their grades with the support of visual design animation and online V-Labs. The study [S27] showed that the V-Lab scored 3.6 out of 5 points for user-friendliness, helpfulness, and motivation, and 3.4 out of 5 points for an effective user interface and adequate content. In [S28], the authors measured usability in terms of ease of use and access: 45% of student responses concerned usability, and 27 responses concerned understanding of the course. In [S34], an improvement in examination results was observed after the introduction of web-based tutorials. In [S35], students found repetition opportunities, exercises, and feedback helpful; however, unclear instructions and the lack of a Back button created problems, highlighting the need to include them to improve learning.

4.2.2 User experience and satisfaction

Three studies [S9, S13, S14] focused on satisfaction and measured user experience with the system. In [S9], test results showed that interactive performance on low-end devices was good, and 90% of students were satisfied with the use of gaming in the lab. In [S13], students expressed high satisfaction (82%) with the innovative teaching method as well as the new way of teaching, understanding, and acquiring knowledge of concepts. In [S14], students stated that they were satisfied with the lab knowledge and practical skills they acquired.

4.2.3 Degree of engagement

In terms of interaction, two studies [S18, S19] assessed the degree of users' engagement with the system. In [S18], 62% of participants found the V-Lab more engaging than other virtual exercises. In [S19], students who used V-Labs for summative assessment found them helpful and user-friendly; more than 50% reported engagement owing to ease of use, accessibility, and support for the learning materials. Table 4 offers a summary of the reviewed studies' assessment metrics, evaluation metrics, and study focus.


Table 4. Assessment metrics and aspects.

4.3 Accessibility categories

In this study, several accessibility features were identified and extracted from the reviewed literature. They were then organized into seven main categories based on similar themes (see Figure 7). The reviewed studies addressed accessibility either through the development of the lab environment or the design of prototypes for the V-Lab platform.


Figure 7. Virtual lab accessibility categories.

4.3.1 Instructional design support

In terms of instructional design, eight (22.2%) studies discussed how their lab environments support 2D or 3D objects, simulations, or interactive animations [S2, S13, S16, S24, S27, S31, S32, S33]. Seven (19.4%) studies indicated that their labs support video or provide audio content descriptions [S3, S12, S17, S21, S25, S28, S32]. Regarding zooming, four (11.1%) studies mentioned improved zooming and viewpoint control [S3, S8, S16, S32].

4.3.2 Interaction support

Regarding the V-Lab's interaction-supporting technology, 13 (36.1%) studies reported that their labs support adjustable parameters and rich input options [S2, S5, S8, S10, S12, S14, S19, S20, S21, S22, S24, S26, S32], while seven (19.4%) studies supported submission and automatic grading for learner assessment [S10, S14, S17, S20, S24, S28, S36]. Six (16.7%) studies emphasized the importance and incorporation of immediate and interactive feedback in the lab environment [S5, S15, S18, S28, S30, S31]. Additionally, four (11.1%) studies addressed media or playback controls [S5, S8, S12, S13]. Moreover, three (8.3%) studies indicated that their labs supported a trial-and-error feature [S22, S27, S28], and, interestingly, only one study [S16] addressed the specifics of auto-correction, error indication, and suggestions.

4.3.3 Virtual service accessibility

While V-Lab services are expected to be accessible at all times, only a few studies have addressed the availability of virtual services. Six (16.7%) studies addressed the 24/7 availability of their V-Lab environments for various online activities, projects, exams, and assessments [S1, S3, S7, S10, S19, S31], while three studies (8.3%) addressed both offline and online accessibility [S16, S17, S18]. One study [S7] reported that users need to book the lab in order to use it for a specified time.

4.3.4 Cognitive, visual, and auditory support

Accessibility from the perspective of different forms of (dis)ability has received limited attention; only a small number of such lab environments have been developed. Notably, two (5.6%) studies addressed the integration of assistive technologies into their environments, including keyboard equivalents, visual focus indicators, a simple interface with a smart speaker, and voice recognition tools [S3, S15].

4.3.5 Subject-specific accessibility support

In subject-based lab environments, three (8.3%) studies included domain-specific features, such as reagents, reactions, and user-defined parameters in chemistry-related labs [S9, S21, S25]. Additionally, three (8.3%) studies highlighted that their labs supported a text editor, compiler, assembler, interpreter, execution, and test support for computer-related disciplines [S2, S16, S20].

4.3.6 Learning analytics and reporting features

Regarding reporting in lab environments, five (13.9%) studies highlighted that their labs support real-time monitoring and performance measurement [S5, S9, S13, S21, S22]. Four (11.1%) studies reported the ability of lab environments to download and upload students' reports and statistics [S18, S20, S33, S35].

4.3.7 Other forms of accessibility support

We grouped the remaining accessibility support into other categories. Among them, nine (25%) studies discussed the support and compatibility of their lab environments with all operating systems, Internet browsers, PCs, and smartphones [S2, S4, S7, S9, S11, S14, S15, S16, S20]. Additionally, six (16.7%) studies highlighted the integration of games for learning [S9, S12, S13, S22, S26, S34], and another six (16.7%) highlighted online tutoring options in the lab environment for diverse learners [S13, S15, S17, S27, S32, S34]. Three (8.3%) studies confirmed the presence of multilingual support [S12, S17, S24]. Only two studies mentioned collaborative tools [S13, S30].

5 Discussion

This section discusses our main findings, guided by the research questions.

(RQ1): What essential accessibility features are integrated into V-Labs to cater to diverse user needs?

To address this research question, we examined the accessibility features reported for the labs in the reviewed studies and framed them from four perspectives.

Perspective 1: Accessibility of lab features to users with cognitive, visual, and auditory limitations. While several studies have developed V-Lab environments for students, limited attention has been given to how students with learning disabilities navigate these environments without being constrained by their abilities (Gavronskaya et al., 2021). Gavronskaya et al. emphasized the importance of providing appropriate lab interfaces that focus on students' abilities, enabling them to gain a sense of mastery through experimentation. However, the review revealed a scarcity of studies addressing the integration of assistive technologies. Only two studies [S3, S15] demonstrated features such as keyboard equivalents, visual focus indicators, simple interfaces with smart speakers, and voice recognition tools, which contribute to the inclusion of users with diverse abilities. For example, students with motor or visual limitations can navigate and interact with V-Labs without needing a mouse, while the integration of smart speakers may facilitate experiments for users with visual impairments.

Despite the inclusion of these features, the studies did not highlight the benefits of such assistive tools on learning outcomes, task completion, or deeper understanding. In this regard, Lin et al. (2017) emphasized the importance of designing teaching activities and the flexible application of technology tools in integrated education. Further exploration is needed to determine practical teaching strategies for enhancing effectiveness using V-labs.

On the other hand, Bong and Chen (2024) raised concerns about the challenges faced by students with disabilities due to the inaccessibility of digital tools and learning materials. Our study suggests that existing V-Labs are largely inaccessible to students with cognitive, visual, and auditory limitations. Adhering to digital technology accessibility guidelines (Chisholm et al., 2001) is essential, necessitating a focus on labs designed for different abilities. For example, students with cognitive limitations may benefit from V-Labs with simplified interfaces, clear instructions, step-by-step guidance, and interactive feedback. Incorporating these features, along with multiple learning modes and customization options, ensures accessibility and accommodates the needs of cognitively limited students. Additionally, addressing the challenges faced by students with limited vision requires V-Labs with screen reader compatibility, text-to-speech functionality, keyboard navigation, and compatibility with assistive listening devices (Butler et al., 2017; Bell and Foiret, 2020). Further research on virtual learning tools that address the diverse needs of students with different abilities, and on their impact on the educational process, is warranted.
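One way to operationalize such feature audits, sketched below in Python, is a simple coverage score against a checklist. The checklist here is a hypothetical one distilled from the assistive features named in the reviewed studies [S3, S15] and the guidelines cited above; it is not a standardized instrument such as WCAG:

```python
# Hypothetical checklist of assistive features (illustrative, not a standard).
ASSISTIVE_FEATURES = {
    "keyboard_equivalents",
    "visual_focus_indicators",
    "screen_reader_compatibility",
    "text_to_speech",
    "voice_recognition",
}

def accessibility_coverage(lab_features):
    """Return the fraction of checklist features a given V-Lab supports."""
    return len(set(lab_features) & ASSISTIVE_FEATURES) / len(ASSISTIVE_FEATURES)
```

A lab supporting only keyboard equivalents and text-to-speech would score 0.4 on this checklist; such a score is only as meaningful as the checklist behind it, which is precisely why standardized evaluation instruments for V-Labs are needed.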

Perspective 2: Accessibility of lab features for users with diverse cultural and linguistic backgrounds. Considering diversity and equity in education, and exploring multiple approaches to teaching learners while fostering motivation in practical and laboratory work, are essential for effectively preparing students and facilitating their mastery of practical skills. Hunter et al. (2010) stated that thoughtful design of laboratory activities can positively impact students with diverse educational, social, and cultural backgrounds. However, less attention has been given to how laboratory experiments can promote diversity and inclusion (Nonnemacher and Sokhey, 2022). Few studies highlight accessibility features that benefit users from different backgrounds and thereby promote equity and inclusion in V-Labs. For example, only three reviewed studies [S12, S17, S24] indicated the integration of multilingual support in V-Labs. While this feature addresses the language barrier during experiments, promoting linguistic diversity in instructional approaches through multilingual support needs further attention. To address this issue, Hackl and Ermolina (2019) suggest establishing inclusive cultural features within laboratories.
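Multilingual support of the kind these three studies describe is often implemented as a message catalog with a fallback language. The following Python sketch is purely illustrative (the catalog keys, strings, and language codes are invented, and real V-Labs would more likely use an i18n library such as gettext):

```python
# Hypothetical message catalog for a V-Lab UI with fallback to English.
MESSAGES = {
    "en": {"start_experiment": "Start experiment"},
    "fi": {"start_experiment": "Aloita koe"},
}

def translate(key, lang, default_lang="en"):
    """Look up a UI string, falling back to the default language when missing."""
    return MESSAGES.get(lang, {}).get(key) or MESSAGES[default_lang][key]
```

The fallback behavior matters pedagogically: a learner whose language is unsupported still receives an intelligible interface rather than a blank label, though, as noted above, translation alone does not address culture-specific pedagogical conventions.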

Studies also explored integrating online tutoring, another V-Lab feature that engages students in learning through active participation rather than passive observation. When online tutoring is integrated into V-Labs, diverse students are assisted in several ways, e.g., through error messages warning about wrong actions or hints about the next step of the practice (Paladines et al., 2023). However, only 16.7% of the included studies demonstrated integration of this feature into developed lab environments, and they did not explore how tutoring contributes to real-time support for students who experience challenges or difficulties during lab sessions.

In addition, collaborative tools are another V-Lab feature that supports culturally diverse students. Two included studies [S13, S30] highlighted collaborative tools that promote teamwork. According to De La Torre et al. (2013), extending V-Labs with collaborative tools fosters students' practical skills through working in groups of two or more, mutually searching for understanding and exchanging ideas on difficult concepts. Several open questions remain to be investigated from this perspective: Which modalities of online tutoring (e.g., chat, video, audio) are most effective in the context of V-Labs, and what does online tutoring contribute to peer-to-peer learning in the V-Lab environment? How can cultural nuances and language-specific pedagogical approaches be taken into account when implementing multilingual support in V-Labs? Addressing these questions would encourage learners with diverse linguistic backgrounds and cultures to use V-Labs in academia.

Perspective 3: Accessibility of lab features in terms of instructional design and content availability. Augmenting instructional design with V-Labs has several effects on student engagement in critiquing experiments, enhancing students' critical judgment of inquiry processes and their scientific literacy (Liu et al., 2022). This shows that various instructional design accessibility features improve educational outcomes in lab experiments. Interestingly, many function-related aspects are addressed in the reviewed studies, for example, educational games (16.7%), immediate and interactive feedback (16.7%), 2D or 3D simulations (22.2%), zooming and viewing angle control (11.1%), and video or audio content descriptions (19.4%). Our study extends the idea of instructional design features mentioned in Chan et al. (2021) by further identifying the elements of design features. Integrating gameplay into the virtual learning environment supports educational goals by engaging learners, and the inclusion of 2D or 3D simulations or animations promotes student engagement and enriches the educational experience. Improved zooming and viewing angle control, in turn, serve different lab forms, such as labs designed for students with different (dis)abilities: Adamo-Villani et al. (2006) demonstrated that a zoom feature allows students with limited vision to connect better with small devices.

The availability of V-Lab content at any time, in any place, and on any device is crucial, and several reviewed studies have discussed this issue. For example, 6 out of 36 studies stated that their developed V-Labs are fully accessible to students in both offline and online forms, while one study [S7] required booking lab sessions to access V-Lab services, including exams, projects, and other online experiments. We believe that offering 24/7 services, both online and offline, to accommodate students with varying schedules, time constraints, or residence in different time zones can significantly contribute to equitable access to educational resources for students worldwide. Boukil and Ibriz (2015) support this notion, asserting that cloud-based lab environments offer access unrestricted by time or location. In such circumstances, instructors and learners can access lab documentation and resources at their convenience, facilitating individual or collaborative engagement from any location. However, this prompts us to rethink the vulnerabilities, privacy, and security of such labs, which may risk degrading essential educational characteristics and introduce technical difficulties. Another aspect to explore is how the offline accessibility of virtual labs caters to students who opt for offline study or lack a stable internet connection. Therefore, further investigation is required.

Perspective 4: Accessibility of lab features in terms of interaction design support. Sahin and Gul (2014) raised interesting points regarding the design practices of digital technology, particularly how designers grapple with the complexity inherent in design activities. These insights emphasize the importance of integrating design-supporting features into digital educational tools, including V-Labs. Our study examined the alignment of interaction design support across numerous accessibility features. Building on this, Wray et al. (2019) emphasized the necessity for users to provide a model of their motivations, goals, and contexts to inform the development of user-centric interventions. However, such considerations were largely absent from the studies reviewed, with only a few addressing interaction design accessibility.

Among the 36 studies analyzed, 13 identified adjustable parameters and extensive input options. This focus on customizable parameters is commendable, as it contributes to fostering an inclusive learning environment and encourages creative experimentation. Höök and Löwgren (2021) corroborated the benefits of utilizing digital tools with adjustable parameters and extensive input options to enhance learning experiences.

Furthermore, three studies delved into the utility of trial-and-error features, highlighting their role in facilitating experimentation within structured learning objectives. Horsman (2018) noted that students employing a trial-and-error approach exhibited statistically lower levels of test anxiety compared to their peers. However, the reviewed studies did not elucidate how trial-and-error features could help destigmatize errors and achieve desired outcomes through error recognition and elimination. Another significant aspect, intelligent suggestions together with error display and correction, received attention in only one study [S16]. Although discussed in a different context, Ai (2017) identified corrective feedback as effective in aiding learners to self-identify and rectify issues, albeit occasionally requiring intervention from an onsite tutor when the system falls short. Crucial questions about autosuggestion and error correction in lab experiments, such as the scope of errors autocorrection can address and how V-Labs can provide meaningful feedback on correcting and refining experimental techniques, remain unexplored by the studies in our analysis.

Ballantyne et al. (2018) examined the accessibility of universal design guidelines within selected mobile applications, contributing to a broader comprehension of accessibility evaluation in design contexts. Although their evaluation centered on mobile applications, our study extends this assessment to the domain of V-Labs. Consequently, there remains untapped potential for further investigation into V-Labs, particularly in the development of virtual learning tools. Integrating features aimed at enhancing accessibility could not only improve the overall user experience but also advance the ongoing evolution of inclusive educational technologies.

(RQ2): How do existing studies assess the effectiveness of the accessibility of V-Labs, and which aspects do they assess?

The effectiveness of V-Labs has been examined in several ways. For example, Faias et al. (2007) described two approaches: heuristic evaluation conducted by experts based on usability guidelines, and practical tests with users. While accessibility guidelines exist for evaluating the effectiveness of digital tools such as websites (e.g., WCAG), to our knowledge there are no standard guidelines for evaluating the effectiveness of V-Labs. We found that the focus of the evaluation varied with the study's aim. For example, Manyilizu (2023) evaluated the effectiveness of the lab in terms of its impact on student learning outcomes.

In this review, although we found the assessment of V-Lab accessibility organized around three focuses, fewer than a third of the reviewed studies (11 out of 36) evaluated and stated the assessment aspects of their V-Labs. Our result aligns with the findings of Lai et al. (2022) that the evaluation of V-Lab effectiveness has a gap and requires further investigation. The reviewed studies primarily evaluated three aspects: usability of and interaction with the lab, user experience and satisfaction, and level of engagement with the system.

Different scholars have raised several topics related to evaluating the usability of and interaction with such systems. While reviewed studies such as [S1], [S27], and [S28] focused on technical usability checklists completed by users, others like [S17] evaluated pedagogical usability, such as the lab's contribution to improved examination results or grades. Interestingly, our result also aligns with the findings of Kumar et al. (2018) on the usability analysis of V-Labs, in which they identified 15 technical and 21 pedagogical usability checklist items for evaluating labs. However, open problems remain, for example, identifying which types of interactions (e.g., simulations, virtual experiments, collaborative features) are most effective for promoting active learning in V-Labs.

Our review also found evaluations of labs through ratings of user experience and satisfaction. Estriegana et al. (2019) investigated students' perceived satisfaction as a critical issue in better understanding learners' behavioral intention to use online learning environments (e.g., V-Labs). In our study, [S13] and [S14] discussed students' satisfaction with the integration of gamification in V-Labs, with their understanding of course topics, and with the practical skills acquired. Further studies are required to examine key indicators of user satisfaction when using V-Labs. The reports also examined the level of engagement but did not specify the level of interactivity that contributes to student engagement and understanding of complex concepts.

On the other hand, we found that only 42% of the studies evaluated their V-Labs. The studies used different assessment methodologies, even for the same accessibility aspects, which may introduce bias in determining the true effectiveness of the developed labs. Although study objectives vary, future work should either develop and validate evaluation instruments or adopt existing ones, such as the technical and pedagogical instruments of Kumar et al. (2018).

6 Limitations and future research

Our study acknowledges several limitations in terms of search coverage and study inclusion. Primarily, we conducted searches and selected studies exclusively from three databases, focusing on conference proceedings and journal articles. Our search criteria were limited to two terms, “Virtual Lab” and “Accessib*,” potentially overlooking relevant studies on V-Labs. We acknowledge that different fields use various terms, such as remote lab, online lab, remote experiment, online experiment, and simulation in medical science. Future research could incorporate these key terms.

In the context of the study’s focus, we aimed to assess the accessibility of V-Labs. We found that achieving equitable and accessible V-Labs is still a distant goal, prompting us to consider accessibility from four different perspectives. This study also calls for investigations into the accessibility of other emerging virtual education technologies including Learning Management Systems, e-learning platforms, VR, and AR within an educational context for various learning ability students.

Several unanswered questions emerge, necessitating further research. For example, how can the increased accessibility of VR and AR technologies benefit across different education levels and various learner demographics (diverse learners)? How do adaptive learning technologies, such as AI applications, address accessibility issues and enhance the effectiveness of V-Labs or other learning environments in delivering person-centric learning experiences? Additionally, how do students understand accessibility concepts in classrooms during the development of learning environments or systems that are considered for all users? These guided questions serve as a foundation for future work stemming from our studies.

7 Conclusion

Addressing accessibility in V-Labs is pivotal to enabling full student participation and to promoting diversity and innovation in the field. Harnessing the potential of V-Labs in education requires a multi-faceted approach that ensures proper functionality and accessibility while addressing existing gaps in accommodating diverse users' needs. In this work, we reviewed the accessibility of V-Lab technologies by examining how studies address accessibility within the V-Lab environment and which specific aspects they assess. While we emphasize the significance of accessibility features for enriching V-Labs, our study highlighted a predominant focus on higher education institutions in existing V-Labs. A noticeable gap was identified in studies exploring various (dis)abilities in K-12 education, indicating a pressing need for more inclusive and accessible lab development.

The study revealed that understanding aspects of accessibility such as usability, interaction, and student satisfaction with the V-Lab is crucial for promoting inclusive learning environments and ensuring equal opportunities for all learners, regardless of their abilities. The study uncovered a diverse array of accessibility categories encompassing instructional design, interaction support, virtual service accessibility, cognitive, visual, and auditory support, and subject-specific accessibility support, showcasing the multifaceted nature of accommodating diverse learners in V-Lab environments.

Augmenting instructional design with V-Labs positively impacts student engagement and academic literacy. Furthermore, the availability of V-Lab content at any time, in any place, and on any device is fundamental for equitable access to educational resources. However, we found that existing V-Labs give limited attention to how students with learning disabilities navigate these environments without being constrained by their abilities.

Our review highlights the necessity to evaluate the effectiveness of V-Labs, particularly concerning usability, user experience, satisfaction, and engagement. Further research is imperative to address the current gaps and inconsistencies in evaluating V-Labs, ultimately enhancing their impact on student learning outcomes, and ensuring widespread accessibility.

In conclusion, our review has shed light on issues related to the assessment of accessibility in V-Labs and other emerging virtual educational technologies, emphasizing the need to consider accessibility elements during the design and development of various systems. It has provided valuable insight into areas that still harbor significant gaps, paving the way for future studies to address these shortcomings and contribute to the advancement of accessible and inclusive virtual learning environments.

Data availability statement

The original contributions presented in the study are included in the article/supplementary material, further inquiries can be directed to the corresponding author.

Author contributions

FD: Conceptualization, Formal analysis, Methodology, Writing – original draft. MS: Conceptualization, Supervision, Validation, Writing – review & editing. MT: Funding acquisition, Supervision, Validation, Writing – review & editing.

Funding

The author(s) declare that financial support was received for the research, authorship, and/or publication of this article. This study was funded by the Finnish National Agency for Education (EDUFI), Project No. 293050/DNR 34/02.04.01/2023, which was received by FD. The paper is co-funded by the Academy of Finland for the project TOPEILA, Decision Number 350560, which was received by the second author.

Conflict of interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

The author(s) declared that they were an editorial board member of Frontiers at the time of submission. This had no impact on the peer review process and the final decision.

Publisher’s note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

Keywords: virtual laboratory, accessible educational environment, inclusive technologies, online learning, online experiment

Citation: Deriba FG, Saqr M and Tukiainen M (2024) Assessment of accessibility in virtual laboratories: a systematic review. Front. Educ. 9:1351711. doi: 10.3389/feduc.2024.1351711

Received: 22 December 2023; Accepted: 20 May 2024;
Published: 20 June 2024.

Edited by:

Leman Figen Gul, Istanbul Technical University, Türkiye

Reviewed by:

Wahyu Widada, University of Bengkulu, Indonesia
Dewi Herawaty, University of Bengkulu, Indonesia
Julia Choate, Monash University, Australia

Copyright © 2024 Deriba, Saqr and Tukiainen. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Fitsum G. Deriba, fgizache@uef.fi
