- 1 Educational Center, Department of Biomedical Sciences, University Medical Centre Utrecht, Utrecht, Netherlands
- 2 Center for Research and Development of Health Professions Education, University Medical Center Utrecht, Utrecht, Netherlands
Professionals are increasingly confronted with complex problems that require generic skills. These generic skills are important across a variety of domains and contexts. As evaluating such skills can be difficult, this paper reports on the development of the Generic Skills Learning Systematic, with which university students’ self-perceived generic skills learning after a complex problem-solving course can be evaluated. The systematic was developed in an iterative process by analysing 43 learner reports in which students described what they had learned during the course. A formative audit was performed to ensure and increase quality. The Generic Skills Learning Systematic consists of two steps. Step one is identifying students’ learning, where learning is viewed as any described change in generic skills. These changes are grouped into five learning categories: value, understanding, self-level, intention, and progress. Three checks are described to help identify the changes students report. In step two, the generic skills are identified, using an adapted version of an existing categorisation comprising 36 generic skills in total. Next, the application of the systematic is described and frequency distributions are given to provide insight into its usability for educators. The results show that students report learning in a variety of learning categories and generic skills, indicating the breadth of learning in such a complex problem-solving course. In conclusion, educators are advised to rethink the choices made in education regarding the instruction and assessment of students. Broadening our scope of learning and paying attention to the different learning categories can aid the development of the professionals of the future.
Introduction
Education has the task of developing students to be valuable for a changing society and labour market, as well as to fulfil societal needs (Ramaley, 2014). Industry often asks for professionals with generic skills that are applicable in different working environments and in society in general (Chamorro-Premuzic et al., 2010; Regueiro et al., 2021). A broader skill set appears to be necessary for a technological, competitive, and global 21st-century workforce (Nealy, 2005) and for dealing with wicked problems (Berg et al., 2021). Examples of generic skills are the ability to collaborate, communicate appropriately, plan and prioritise, and reflect (Andrews and Higson, 2008; Chamorro-Premuzic et al., 2010; Pache and Chowdhury, 2012).
The result is an increasing need to train and support students in the development of generic skills (Ramaley, 2014) by designing learning environments with suitable learning opportunities (Crebert et al., 2004). A common way to develop the generic skills of university students is to use complex problem-solving learning environments (Ward et al., 2018). Utrecht University has therefore developed generic skills learning opportunities in the context of complex problem-solving. However, so far, little is known about how students develop generic skills in these educational environments, both regarding which generic skills students develop and in what way they develop them. The current paper, therefore, aims to answer the following research question: How can a complex problem-solving course provide insight into university students’ generic skills learning? To answer this research question, a systematic is developed based on student self-reports, and an overview of students’ learning is provided as an example of the application of the systematic. The systematic can be used by students and teachers to provide directions for evaluating generic skills learning.
Defining and selecting generic skills for education
The term skill refers to the ability to use domain-specific knowledge for performing an action or task (Matteson et al., 2016). These tasks and actions can be very specific to one domain and are therefore often called domain-specific skills. Examples are laboratory skills for researching cells or didactic skills for motivating a group of university students. When a skill is usable and beneficial in a variety of domains, situations, and contexts, researchers often use terms like professional skills, transferable skills, soft skills, employability skills, personal skills, and generic skills (Bennett et al., 1999; Raybould and Sheedy, 2005; Shuman et al., 2005; Andrews and Higson, 2008; Matteson et al., 2016; Jääskelä et al., 2018; Touloumakos, 2020; Regueiro et al., 2021; Santos Rego et al., 2021). What these skills exactly entail or refer to is not agreed upon in the literature (Jääskelä et al., 2018). For example, employability skills refer to the importance of these skills for being able to get a job (e.g., Raybould and Sheedy, 2005), whereas professional skills focus on the usefulness of these skills in a professional context rather than on a person’s chance of being employed (e.g., Shuman et al., 2005). Transferable skills emphasise skills that are transferable to other contexts (e.g., Andrews and Higson, 2008), although it has been argued that learning these skills in one context does not guarantee that they will automatically transfer to other contexts (Bennett et al., 1999). In this paper, we will use the term generic skills to refer to the skills related to the broader domain of higher education. The term generic is used to emphasise that these skills are not linked to a specific context (Bennett et al., 1999) and are useful for different situations and activities, also outside of the university (Gilbert et al., 2004). They are therefore thought of as more generalisable, as they are less context-dependent.
Although generic skills are viewed as important for all graduates, it remains under discussion which generic skills belong to the different terms used in the literature, and which generic skills students need to develop in higher education (Gilbert et al., 2004). Jääskelä et al. (2018), for instance, argued that the necessary skills for a constantly changing society cannot be fixed for a longer period. In addition, with a heterogeneous group of students with different backgrounds, skill levels, and future career interests, selecting relevant generic skills beforehand becomes more difficult (Bridges et al., 2011). Crebert et al. (2004) emphasised the importance of promising students opportunities to develop their generic skills, rather than promising that they will acquire a certain set of generic skills, as the latter is impossible to fulfil.
Complex problem-solving as a generic skills learning environment
Complex and real-world problem-solving activities provide students with opportunities to develop their generic skills (Ward et al., 2018). Research on complex problems originated in the 1970s with Rittel and Webber’s (1973) work on wicked problems. More recently, the consensus on what constitutes a wicked problem centres on the complexity of the problem, uncertainty, an open-ended approach, and the diverging perspectives and values of stakeholders (Veltman et al., 2019). This complexity is characterised by many variables that are interconnected and change over time, whereby the influence of these variables on the problem is neither clear nor straightforward (Funke, 2010). The active engagement and participation of students in complex problem-solving learning processes have been found to positively influence the development of skills (Mason et al., 2009). To work on complex problems successfully, an open-ended, critical, and creative inquiry (Rule, 2006), as well as an interdisciplinary approach, is needed (Bridges et al., 2011).
In addition to a nurturing learning environment, generic skills also need to be explicitly instructed and practised. Predictors of generic skills development are interaction and collaboration, integration of theory and practice, feedback, and a positive learning atmosphere (Virtanen and Tynjälä, 2019). The development of generic skills should ideally take place within disciplinary domains (Jones, 2009), as teaching generic skills separately has been found to be ineffective for university students (Hattie et al., 1996). The importance of combining curriculum content and real-world experiences was also emphasised by Australian graduates in a study by Crebert et al. (2004). By choosing complex problems that include diverging perspectives, students can apply their own context and knowledge in contributing to the solution.
Defining generic skills learning: A broad perspective
Although learning is traditionally perceived by both educators and students as a performance outcome, for instance on a test or assignment (Mylopoulos et al., 2016), in the current paper, learning is defined as changes in students. Developing generic skills is a process that takes time. A person needs time to practise applying their knowledge in tasks and situations to become skilled (Matteson et al., 2016). For higher education programmes, which are often composed of separate courses, focusing only on students’ generic skills ability at the end of an educational activity might, therefore, lead to disappointing results in terms of skill learning. De Grez et al. (2009), for instance, found no effect of their feedback condition in presentation training and suggested that this was due to a lack of time. They argued that time is necessary for internalising standards and for practising presenting skills in combination with those standards. Besides time, generic skills development has been argued to be a process of complex interactions between students’ perceptions, knowledge, and behaviour. This is taken into account in the model of Dall’alba and Sandberg (2006), where a combination of experience with the skill and understanding of the skill represents individual learning trajectories. The interactive nature of students’ skills has implications for how to evaluate such skills: at the least, both behaviour and cognition should be included in the evaluation. An example of a theory that combines cognition (including perceptions) and behaviour is the theory of planned behaviour. According to this extensively validated theory, intentions towards behaviour and the underlying beliefs (i.e., cognitions) are important predictors of future behaviour (Fishbein and Ajzen, 2010). Intentions reflect how ready a person is to act and perform a certain behaviour. Bakkenes et al. (2010) incorporated this in their view on teacher development. They studied teacher learning in the context of educational innovation and defined learning more broadly than progress alone. Their definition of learning included several affective, cognitive, and behavioural changes in a person, distinguishing between changes in knowledge and beliefs, intentions for practice, emotions, and practice. For instance, having increased knowledge about how to be a good presenter or positive beliefs about the importance of presenting are considered relevant learning outcomes. Chan et al. (2017) developed and validated an instrument for engineering students to gain insight into students’ perceptions of the value of a set of generic skills alongside their current competency level. All in all, taking such changes into account when evaluating students’ generic skills learning in complex problem-solving environments might be more realistic than focusing only on improvement of the skills.
The current study
As the importance of generic skills for graduates becomes apparent, universities develop learning environments in which students can develop these skills. The evaluation of students’ learning experiences, however, remains difficult due to the complexity and variety of generic skills learning. This paper aims to provide insight into students’ generic skills learning during adaptive complex problem-solving by answering the following research question: How can a complex problem-solving course provide insight into university students’ generic skills learning? To answer this research question, we aimed to develop a systematic that students and teachers can use to provide directions for evaluating generic skills learning. Next, the data on generic skills learning in a complex problem-solving course are shown as an example of how to apply the systematic and of the possible results it could yield.
Materials and methods
Sample
For the current study, an elective course with a focus on complex problem-solving, developed by the Faculty of Medicine within Utrecht University in the Netherlands, was purposefully selected. This course (i.e., co-create: life’s professional challenges) was selected because it included a complex problem (Rule, 2006), an interdisciplinary approach (Bridges et al., 2011), feedback opportunities (Mylopoulos et al., 2016), a positive learning atmosphere, and integration of theory and practice (Virtanen and Tynjälä, 2019). The course furthermore provided students with opportunities to develop generic skills, without these being narrowed down by educators beforehand.
A complex, real-world problem was posed by the city of Utrecht. Students were stimulated to develop a solution in interdisciplinary groups during two consecutive, full-time workweeks. At the start of the course, students were instructed to form their own groups of 4–5 students, composed of students with different educational backgrounds and different educational levels. Each year, a different problem was posed: people in the city of Utrecht too often use motorised vehicles for transportation; people in the city of Utrecht report a high level of loneliness; and the mental wellbeing of people in Utrecht is low. For the students, the posed problem was defined in the same broad manner as described here. This required the interdisciplinary groups to narrow down the problem during the first days of the course. As a result, each group formulated a different workable problem, based on the perspectives and interests present in the group. After formulating a workable problem, the groups researched their target audiences and worked iteratively on their solution. The format of the solution was free, meaning that the groups could choose whether their solution was, for instance, a tangible product, an idea, or a piece of advice. Next to the group process, students worked individually on their professional development. For this purpose, students were instructed to formulate personal learning objectives at the start of the course and to reflect during and after the course. To guide both the team and individual processes, students participated in lectures, workshops, and coaching sessions. Figure 1 shows the setup of the course. During the pitching event on the fourth day and the solution presentation on the 10th day, students participated in real-world situations in which skills could be practised. At these moments, and during other coaching moments throughout the course, students received feedback on the process and product, both as a group and individually. The course was given in person once a year, three times in total, from 2017 to 2019.
A total of 51 participants took part in the course over the 3 years. For the development and application of the systematic, convenience sampling was used by including all students in the sample. Of the 51 participants, two dropped out before the end of the course for personal reasons. Six participants did not hand in a report (see Instrumentation) because they did not need the learning credits; they were recently graduated students and master’s students who took the course alongside their fixed curriculum. This resulted in data from 43 participants: 30 women and 13 men. The participants were following a Bachelor’s (six participants) or Master’s (31 participants) programme or had recently graduated (six participants), from six of the seven faculties of Utrecht University: Medicine (20 participants), Science (nine participants), Geosciences (six participants), Social and Behavioural Sciences (three participants), Law, Economics and Governance (four participants), and Humanities (one participant). The sample is very heterogeneous, as successful complex problem-solving requires people with different backgrounds and experiences to collaborate. Looking at this diverse group of (recently graduated) students might limit the possibility to generalise the results to a specific field, but it is a more authentic group regarding the context of complex problem-solving.
Instrumentation and data collection
To gain insight into students’ learning experiences and how generic skills were developed, students’ self-perceptions from learner reports were used. Most of the changes described by Bakkenes et al. (2010) concern changes in beliefs, attitudes, or thoughts, which are covert and occur in people’s minds. Students’ perceptions of their own learning can therefore be a useful source of information. The learner report is a specific form of self-report in which students give their answer to the central question “What have you learned?” (van Kesteren, 1989). Students select the learning outcomes they want to describe, which can lead to a great variety of described generic skills learning in the context of complex problem-solving. They can, furthermore, include their thoughts, opinions, and ideas without being challenged, providing insight into experiences besides improvement of a skill. As self-reports are relatively often implemented in non-traditional forms of education and no direct interaction between researcher and participants is necessary (Weber, 1990), we used the information from learner reports in the current study.
For the learner reports in the example course, students received instruction and feedback on formulating personal learning objectives and on reflection. For the personal learning objectives, students were instructed to choose a skill that they wanted to develop, to be concrete (e.g., by describing desired behaviours, thoughts, and results), and to make the objective feasible to reach within 2 weeks. For the reflection, students were shown a reflection cycle and instructed to follow it and to reflect on multiple levels. For instance, next to describing the environment and behaviours, students were instructed to reflect on their own capabilities, values, beliefs, and identity. To guide this process, students were stimulated each day to recall their learning experiences and past behaviour using a daily journal to aid their reflection processes (Fishbein and Ajzen, 2010). Two full-day, in-person workshops on personal development were given, in which students were guided by a coach to gain insight into their own development. Based on students’ needs, formative feedback was offered in the form of written feedback and individual conversations with a coach. Feedback included prompts to make the personal learning objectives more concrete and to reflect more deeply, rather than mainly describing the situation. The daily journal entries were rewritten into a final learner report. This learner report was limited to five pages of text or 15 min of video or audio; all of the students used digital text. The time investment for the learner report assignment was approximately 12 h per student. The learner reports were part of the assessment of the course, although the topics in the report and the self-perceived learning of students were not assessed. A rubric, which was shared with students before the start of the course, was used for grading. The rubric consisted of three criteria: whether students (1) followed a reflection cycle, (2) described different angles (their environment, behaviour, norms, and values), and (3) were critical. Each criterion had three levels: insufficient, sufficient, and exemplary. Based on these criteria, the learner report could increase or decrease a student’s individual grade (20% of the final grade) by 0.5 points. The criteria and standards were not incorporated into the development or the application of the systematic in the current paper.
The final dataset consisted of 43 written learner reports, in which students reflected on their learning processes during the course.
Data analysis
The written learner reports were analysed to develop a systematic for evaluating students’ generic skills learning perceptions and, at the same time, to provide an example of the results generated from applying the systematic. The results section therefore consists of two parts. The first part describes the structured search for change and generic skills in students’ learner reports. Categories were created using content analysis (Weber, 1990) and resulted in the developed Generic Skills Learning Systematic. The second part describes the application of the systematic and the subsequent results. Frequencies of students’ generic skills learning are presented for the learning categorisation, the generic skills, and the combination of both.
Audit procedure
To evaluate and improve the quality of this research study, a formative audit was performed (Akkerman et al., 2008). The auditor was an external research assistant with a background in educational sciences and qualitative research experience in the domain of reflection activities. The auditor critically assessed the research questions, methods, coding, data analysis, and data interpretation and provided the authors with feedback using two criteria: logicalness and transparency. To do this, the auditor studied documents on the systematic’s development and coding procedure and had access to the learner reports to provide feedback on how the coding was performed. The audit was marked as completed after the authors responded satisfactorily to the auditor’s improvement comments; this mainly involved extending and clarifying the descriptions and wording in the documents. As most students in the course were not native English speakers, the auditor, who was a native English speaker, also checked the interpretations to reduce this potential bias where possible. The audit summary written by the auditor described which changes were made and stated that the introduction and method were logical and transparent. The audit summary is added as Supplementary material. The complete report includes all research decisions and interactions between auditor and auditee and is available on request.
Ethical approval
The authors performed the research according to the rules of the Ethical Review Board of the Dutch Association for Medical Education (NVMO). The following criteria must be, and have been, met: data were gathered from regular educational activities, the researchers had access to the data through their educational role as a teacher or the data were completely anonymous for the researcher, and reports were irreducible to individuals.
Results
The generic skills learning systematic: Development
The Generic Skills Learning Systematic is based on an existing generic skills categorisation and an inductive categorisation for learning. After familiarisation with the data, phrases were selected in which generic skills were mentioned in combination with a change. A phrase includes all information necessary to code it in regard to both the learning category and generic skill. A phrase can, therefore, consist of one sentence, part of a sentence, or multiple sentences.
The final version of the developed systematic consisted of two steps and three checks, which together determined whether a written phrase in the learner report indicated learning of a generic skill (see Figure 2). The first step was the identification of a student’s learning by categorising the described change. During the selection of phrases, it became apparent that, due to the large amount of text in students’ learner reports, selecting all instances of change might be difficult. This prompted the development of three checks aimed at helping with the identification of a change. These checks included determining whether: (A) the student used a change indication word, (B) the described change was explicitly related to the course or educational programme, and (C) the described change included the student’s own perspective. The second step was to code the generic skill on which the student reported change.
Step 1: Identifying student’s learning by their described change
For learning, the inductive approach was based on viewing learning as indicated by a change (Bakkenes et al., 2010) and therefore included all identified changes. An example including a described change is “I listened better to my team member.” An example that does not indicate a change is “I listened to my team member,” as the student only describes an action. The changes reported by students in their learner reports were categorised into the following five learning categories: (1) value is a positive change in the student’s belief about the importance of the skill, (2) understanding is a change in a student’s knowledge about how a skill works, (3) self-level is a change in a student’s self-knowledge regarding their mastery of the skill, (4) intention is a student’s willingness to work on the skill later, and (5) progress is a positive change in a student’s mastery of the skill. Table 1 provides an overview of the definitions of the learning categories and example phrases from students’ learner reports.
Check A focused on identifying words indicating a change. These words were used both to select relevant phrases for coding and to inspect whether coded phrases included a change. All reports were read and checked for the selected change indication words, which resulted in a larger number of coded learning experiences. We found 34 words in the learner reports and inductively categorised them into four groups (see Table 2). Based on the data, the indication word categories were linked to the learning categories in Table 1. The category describing the urge to change included words like “have to” and “must” and was linked to a student’s self-level. The category describing the intention to change later included words like “will” and “future” and was linked to a student’s intention. The category describing the receipt of new information, with words such as “noticed,” “found out,” and “realised,” was related to the value, understanding, and self-level categories. The overarching category can be linked to all the learning categories, as it includes general words such as “learned,” “better,” and “improved.” As several change indication words can apply to multiple learning categories, the context in which the change indication word was used should be leading. For example, the phrases “I learned how to present” and “I learned that I’m not a good presenter” contain the same indication word (i.e., learned), but the first reflects understanding and the second self-level.
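To make Check A concrete, the sketch below illustrates how a keyword scan could flag candidate phrases in a learner report. It is a minimal, hypothetical illustration: the word list is a small subset of the 34 indication words in Table 2, sentence splitting and substring matching are naive, and in the study itself the selection and coding were performed by hand. Flagged phrases would still require the contextual judgement described above to assign the correct learning category.

```python
# Illustrative sketch only: a keyword scan for Check A, assuming a simplified
# subset of the change indication words reported in Table 2. Phrase splitting,
# the word list, and the group names are hypothetical simplifications.
import re

CHANGE_INDICATION_WORDS = {
    "urge": ["have to", "must"],                              # linked to self-level
    "intention": ["will", "future"],                          # linked to intention
    "new_information": ["noticed", "found out", "realised"],  # value/understanding/self-level
    "overarching": ["learned", "better", "improved"],         # all learning categories
}

def flag_candidate_phrases(report_text: str) -> list[dict]:
    """Return sentences containing a change indication word, as coding candidates."""
    sentences = re.split(r"(?<=[.!?])\s+", report_text)
    candidates = []
    for sentence in sentences:
        lowered = sentence.lower()
        hits = [
            (group, word)
            for group, words in CHANGE_INDICATION_WORDS.items()
            for word in words
            if word in lowered  # crude substring match; manual checks remain necessary
        ]
        if hits:
            candidates.append({"phrase": sentence, "indication_words": hits})
    return candidates

if __name__ == "__main__":
    example = ("I listened to my team member. I realised I easily fill the supportive role. "
               "I will prepare my pitches earlier in the future.")
    for candidate in flag_candidate_phrases(example):
        print(candidate)
```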
Checks B and C resulted in stricter coding. With Check B, it was examined whether the student attributed the learning to the course. A self-level example from a learner report that was related to the course is “the Belbin test shows that I could be a coordinator, yet this was not my role in the group. I realised that I easily could fill in the supportive role, especially when someone else is more dominant in a group and prefers to lead,” as the Belbin test was part of the course. A progress example is “I had discussed my struggle on Monday with my teammates, so it was easier for me and for them to address anything related to it whenever that popped up,” as this describes a change in behaviour. An example that was not attributed to the course is the statement “from that conversation in the train, I became aware of the importance of listening,” as this statement could indicate an already existing belief. The context should therefore include a reference to the followed education. The last check, Check C, referred to whether the learning reflects the student’s own learning (e.g., “I collaborated better”). Learning from the team perspective (e.g., “we collaborated better”) or the perspectives of others (e.g., “a team member said I collaborated better”) can also be included, for instance, in educational activities with a strong emphasis on collaboration. An example that was not coded was “they collaborated better,” as this excluded the learning of the student writing the learner report.
Step 2: Identifying the generic skill on which the student changed
To identify the specific generic skill a student learned, an adapted version of the categorisation by Dunne in Bennett et al. (1999) was used. They developed a categorisation of generic skills based on the existing literature, on the experiences of three university department heads and of staff teaching skills courses, and on observations of students participating in skills courses. It includes four main clusters of generic skills that are not linked to a specific domain, namely, the management of (1) self, (2) others, (3) information, and (4) tasks, with a total of 36 skills (see Table 3 for the list of skills and their definitions). The skill definitions were adapted to the context of complex problem-solving, and skills that were not distinguishable in our data were merged. An example is the skill execution, which was originally composed of multiple skills that could not be located separately in our data. We furthermore added the code general for learning categories and generic skills when the information in the learner report was unclear or incomplete (e.g., “I learned a lot” was coded as general-general).
Table 3. Generic skills categorisation with four generic skills categories adapted from Dunne (see Bennett et al., 1999), with the names of the skills on the left and skill definitions on the right.
Applying the generic skills learning systematic
First, the phrases in the learner reports indicating a change in generic skills were highlighted and transferred to rows in Microsoft Excel. Then, for each phrase, the indication word(s) were selected and placed in the column next to the phrase. Next, the learning category and generic skill were coded. When a phrase referred to multiple learning categories and/or generic skills, it was placed on a separate row for each code, resulting in one phrase appearing on several rows. Coding was done by the first author with support from the second and last authors. In six rounds of coding, learner reports were coded simultaneously by the researchers. On average, 12 phrases were coded each round, of which two to three needed to be discussed between researchers. The discussed differences resulted in corrections to the coding scheme and the coding itself to make them more precise. Based on these discussions, we chose to code strictly: ambiguous phrases requiring too much interpretation were not coded. This might have resulted in missed information, for instance, when students did not use explicit wording. An example from a student is “I realised I might have answered too quick for my mind to keep up,” which could refer to multiple underlying skills (e.g., coping with stress, listening, communicating, reflecting). We furthermore chose to include the “we” statements in our case course, as collaboration was such a large part of the course. Another aspect resolved in discussion was that a skill working as a mechanism for improving another skill was not coded. For example, “I listened to my team and this helped me to improve my leadership skills” indicates an improvement in leadership skills, not in listening. When agreement proved difficult, similar examples from other learner reports were first reviewed to see whether they could provide a clearer picture. If these examples did not provide new insights, the three authors voted, and these decisions were written down in a “comments” column in the Excel file. After all discussions were resolved, all learner reports were coded again and checked for completeness and consistency in coding.
Due to the explorative nature of the study, descriptive analyses were performed in three steps. First, the frequencies of the described changes reflected in the learning categories were calculated. In the second step, we calculated the frequencies of the generic skills. For this, we only included the unique changes; that is, when a student described a specific change on a skill multiple times in the same learner report, it was counted only once. Third and last, we described the relationship between learning categories and generic skills. Absolute numbers and percentages are included in Supplementary Table A. In this paper, we report only the absolute numbers for clarity.
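As an illustration of how these three descriptive steps could be reproduced from a coded spreadsheet, the sketch below shows a possible pandas workflow. The column names, example rows, and the deduplication key are assumptions for demonstration only; the original analysis was performed on the Excel file described above, and Table 4 reports numbers of learner reports rather than counts of unique coded changes.

```python
# Illustrative sketch of the three descriptive steps, assuming the coded phrases
# are stored one row per phrase-code combination (as described for the Excel file).
# Column names and the example rows are hypothetical.
import pandas as pd

coded = pd.DataFrame(
    [
        {"report_id": 1, "phrase": "I listened better to my team member",
         "learning_category": "progress", "generic_skill": "listening"},
        {"report_id": 1, "phrase": "I listened much better to my team",
         "learning_category": "progress", "generic_skill": "listening"},
        {"report_id": 2, "phrase": "I realised I am not a good presenter",
         "learning_category": "self-level", "generic_skill": "presenting"},
    ]
)

# Step 1: frequency of each learning category over all coded phrases.
category_counts = coded["learning_category"].value_counts()

# Step 2: frequency of generic skills, counting a skill-category change only
# once per learner report (duplicates within the same report are dropped).
unique_changes = coded.drop_duplicates(
    subset=["report_id", "learning_category", "generic_skill"]
)
skill_counts = unique_changes["generic_skill"].value_counts()

# Step 3: cross-tabulation of learning categories against generic skills.
# Note: this counts unique coded changes, whereas Table 4 in the paper
# reports numbers of learner reports.
crosstab = pd.crosstab(unique_changes["generic_skill"],
                       unique_changes["learning_category"])

print(category_counts, skill_counts, crosstab, sep="\n\n")
```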
From the 43 learner reports, a total of 481 self-reported changes in generic skills were coded. Of these 481 changes, 115 changes appeared several times in the same report and were removed before analyses. This resulted in 364 unique coded phrases, of which 86 (25%) received the code “general,” indicating that there was not enough information to code the phrase as a generic skill. The final dataset consisted of 278 coded phrases divided over 43 learner reports (M = 6.4, SD = 3.6, min-max = 1–16). The number of phrases that received a certain code was positively correlated with the number of words in a learner report (r = 0.437, p = 0.004). Table 4, last row, shows that for the learning categories, learning about the self-level was described in 40 of the 43 learner reports. Understanding and progress were described quite often as well, occurring in 30 and 29 learner reports, respectively. Learning about the value of a generic skill was mentioned the least. In 12 instances, occurring in eight reports, there was not sufficient information to select the right learning category.
Table 4. Total learning categories (last row), total generic skills (last column), and the relation between learning categories and generic skills.
The last column of Table 4 shows the frequencies of the generic skills categories. Skills in the management of others category were reported the most, with 39 students mentioning them. Within this category, the skill collaborating had the highest prevalence, being described in 36 reports, followed by giving feedback (17), respecting (11), and leadership (10) (see Supplementary Table A). The second highest frequency was found for the management of self category, in which 29 students described a change. Although coping with stress in that category was clearly mentioned the most (by 12 students), the other skills were mentioned by one to seven students. Skills in the management of information category were described by 25 of the 43 students. Here, communicating was described the most (by 17 students) and the other skills by between one and six students. The management of tasks skills were mentioned the least; 15 students included these in their reports. Of these, executing and prioritising were mentioned more often (by seven and five students, respectively) compared to the other skills in this category. Zooming in on the skills that were mentioned less often across all generic skills categories shows that eight skills were mentioned by only one student and three skills by two students in total. These 14 coded phrases were retrieved from 11 individual learner reports, indicating a large variation in reported skills. The skills analysing, media, and key features were not mentioned by any of the students in their learner reports. Zooming out to the generic skills categories, eight students reported learning skills in all four categories in their learner report, 13 students in three categories, and 16 students in two categories.
The frequencies of the learning categories are also shown against the generic skills categories in Table 4. Supplementary Table A provides a more detailed description by also including the specific generic skills within the generic skills categories. The relatively high frequency of learning on the self-level matched the learning of the generic skills: for most skills, the self-level learning category had the highest prevalence. For some skills, however, this was different. For instance, for respecting (management of others) and coping with stress (management of self), students more often reported having progressed than having gained insight into their self-level. For reflecting (management of self), formulating intentions was the most frequent category. For the skill creativity (management of information), both progress and intention were mentioned twice, whereas self-level was mentioned only once. Other skills, for instance executing (management of tasks), selecting (management of information), and options (management of tasks), showed a more even distribution among the learning categories; for these skills, students’ learning was not concentrated in one learning category. Another notable finding was that, for communicating (management of information) and collaborating (management of others), a relatively large number of students reported understanding the skill better compared with the other skills.
Discussion
Gaining insight into students’ generic skills learning can be difficult. This is due to the difficulties in conceptualisation and the large variety of possibly relevant skills (Jääskelä et al., 2018), as well as the time it takes to master skills (De Grez et al., 2009; Matteson et al., 2016). The purpose of the current study was to provide educators with a systematic to evaluate students’ self-perceived generic skills learning after complex problem-solving that takes the process of skills development into account. A two-step Generic Skills Learning Systematic was developed to provide a systematic approach to evaluating and documenting students’ self-reported generic skills learning using written learner reports. The first step is to identify the changes students reported. In addition to three checks (i.e., change indication words, relation to education, and the student’s own perspective), we deduced five types of change, referred to in this paper as the learning categories: value, understanding, self-level, intention, and progress. All these categories have been theorised in the literature, separately or together, to influence learning. Metacognitive knowledge theories suggest that when students become more aware of their own thinking and more knowledgeable about cognition, they will be better able to act on this awareness and improve learning (Pintrich, 2002). The three types of metacognitive knowledge can be linked to the learning categories. Strategic knowledge refers to the what and how of strategies and could be reflected in understanding. Cognitive task knowledge refers to when and why these strategies should be used and could be reflected in value and intention. The last, self-knowledge, refers to knowing one’s strengths and weaknesses and could be reflected in self-level and progress.
Looking at the separate learning categories, Chan et al. (2017), for instance, included, next to progress, degree students’ perceptions of the importance of each skill for their future career, rated on a 1–5 Likert scale. They argued that students’ perception of the value of a skill motivated them to learn it. This mechanism might be closely related to intrinsic value, which refers to being motivated to perform a task due to personal benefits and interests rather than external incentives (Zimmerman, 2008). Zimmerman reasoned that a higher intrinsic value results in a higher motivation to complete an activity, which could therefore also be beneficial for improving performance. For the understanding learning category, the literature has emphasised the importance of explicit instruction in a skill for skill development (Jones, 2009). The course included explicit skill workshops, which might explain the finding of changes in students’ understanding in the learner reports. The other learning categories, self-level, intention, and progress, are closely related to reflection activities. During reflection, students need to actively examine their responses and beliefs in unusual situations and create new understandings (Rogers, 2001). This increased understanding of oneself, based on specific situations or learning experiences (Sandars, 2009), can be related to students’ increased self-level. Professionals who can adapt to new situations are more aware of what they do and do not know compared with those who are less able to adapt (Ward et al., 2018). This ability to adapt and to continue learning is necessary for all professionals (Sandars, 2009). In addition, the step of planning to act in reflection (Rogers, 2001) can be related to intentions, as intentions are important predictors of future behaviour, showing readiness to act (Fishbein and Ajzen, 2010). Action should then be taken based on these intentions, which might result in better performance.
The results from applying the Generic Skills Learning Systematic showed great variety in students’ learning experiences. Most of the learning fell within the self-level learning category, indicating that students mostly gained insight into their current skill level. This relatively high prevalence might be due to the short time span and focus of the course. A whole track was dedicated to personal development and reflecting on one’s own strengths and weaknesses (Figure 1). Less time was available for testing out multiple new behaviours, although students did report having progressed during the course. During the skills workshops in the course, attention was given to the value and understanding of certain skills, but not all possible skills could be explicitly instructed in a formal education setting. One skill that was explicitly instructed, could be practised during the course, and was the focus of feedback was collaboration. The results show that 14 students (33%) reported understanding this skill better, which could demonstrate the importance of combining practice with instruction and feedback (Tynjälä, 2008). Concerning the four main generic skills categories (Bennett et al., 1999), students described their change most often in the management of others category, followed by the management of self and of information. Of all 36 specific generic skills, the skills collaborating, communicating, coping with stress, and giving feedback were reportedly learned the most. An explanation for this finding is the design of the course, in which the focus was on solving complex problems using interdisciplinary teamwork and incorporating multiple (conflicting) perspectives in a short period of time. Kohn Rådberg et al. (2020) investigated student learning in a challenge-based learning environment, in which students worked in a multidisciplinary fashion with a variety of stakeholders in society to identify and solve complex sustainability problems. They also found that students developed their collaboration and communication skills in this environment. Coping with stress appears to be more important in our course compared to that of Kohn Rådberg et al. (2020). This might be due to the real-world societal partner in our course, who not only brought the problem into the course but also chose a “winning” solution to be developed further and implemented in society. The design of the course may also explain why students focused less on the skills in the management of tasks category, as fewer workshops and exercises focused on this aspect. Differences between students and their learning focus are furthermore reflected in the large variation of reported combinations of generic skills and learning categories. For example, the skill respecting was changed more often on progress compared to the other learning categories, whereas for the skill flexibility this was on the self-level. Eleven skills (31% of all skills) were reported by only one or two students. The variation found could be a result of differences in students’ interests and in the perceived relevance of specific generic skills, as a constantly changing society requires different skills for different people and situations (Jääskelä et al., 2018). Another explanation is that individual students develop their skills differently. Remington-Doucette et al. (2013), for instance, researched real-world problem-solving as a key pedagogy for sustainability education purposes.
They found that students with different disciplinary backgrounds developed key sustainability competencies differently. Badcock et al. (2010) found a significant difference in critical thinking and interpersonal understanding between students enrolled in one or two degrees, with both being higher for the students with a double enrolment. Overall, the complex problem-solving learning environment provided students with opportunities to develop their generic skills in a variety of ways.
Limitations and future research
A point for further discussion is the context of studying self-perceived learning. The advantages of using self-perceived learning in the domain of generic skills are that students can report on the skills that they experience as important and that it uncovers thoughts and attitudes. Knowledge about oneself and being able to communicate this knowledge is also an important part of life. Petruzziello et al. (2022), for instance, argue that self-presentation ability, which refers to one’s ability to present one’s own expertise, should be considered an important factor for success in job interviews. Self-perception of learning, which in the current paper was called progress, is, however, not an objective overview of what is learned. Students are weak at self-assessment and tend to underestimate themselves when assessing challenging tasks (Chevalier et al., 2009). For the other learning categories, literature on the accuracy of students’ self-perceptions is not widely available. One suggestion for future research might therefore be to research the importance of objective observations when looking at the underlying mechanisms of behaviour, such as attitudes and thoughts. Another suggestion is to research the predictive validity of the different learning categories for self-reported generic skill improvement and for other generic skills measurements. It would be interesting to take individual trajectories into account as well, to see whether different students learn generic skills differently (Remington-Doucette et al., 2013). A mixed methodology of quantitative and qualitative methods could help to gain a more accurate insight into what students learned during complex problem-solving.
Another point for further discussion is the connection between the skills we found in our course and the skills that could be found in other contexts. Jääskelä et al. (2018) note that the development of generic skills is likely to differ between contexts. Relationships between the findings in the current study and the set-up of the complex problem-solving course used were also suggested in the discussion above. Researching the validity of the systematic in other contexts or fields might therefore be interesting and might yield other or more relevant learning categories. Sandars (2009) argues, for instance, that it is important for medical professionals to understand their own personal values and beliefs next to knowledge and skills. These more personal attributes are reflected in important professional attitudes, such as empathy, which might be an additional learning category in the context of medical education. Next to different learning categories, other domains might emphasise different generic skills compared to those found in our sample. Therefore, exploring challenges in other fields might provide new insights into generic skills learning in problem-solving contexts.
The last point for discussion is the practical use of the systematic. In the current paper, the selection of relevant phrases and the coding were done by hand. A helpful, time-saving improvement would be to use text analysis software for this in the future. This way, the sample size can be increased as well, and more generalised conclusions might be drawn regarding students’ generic skills learning.
Implications
The current study presented the Generic Skills Learning Systematic, which can be used to evaluate students’ self-perceived generic skills learning. The systematic takes a broader perspective on skill learning by including not only students’ progress but also changes in their attitudes, knowledge, and intentions with regard to generic skills. The systematic could additionally inspire educational practices in generic skills development, regarding both the instruction and the assessment of students. For instruction, the learning categories proposed in the systematic could be incorporated into the learning objectives of courses and programmes, depending on the purpose of the education. For example, when an attitude change is desirable, focusing on value would be beneficial. When students in a preparatory or introductory course are required to select follow-up courses themselves, they may benefit more from self-level and intention objectives. A course focusing directly on the development of one or more skills should, next to progress, incorporate understanding goals to reflect having knowledge of the skill. Making these learning goals explicit can help learners direct their attention and effort to activities that are relevant for reaching these goals, instead of to irrelevant activities (Morisano et al., 2010). A good educational activity to help students direct their attention towards the learning goals is reflection. As reflection is an important and useful strategy for lifelong learning (Sandars, 2009), skills development (Harvey et al., 2010), and dealing with ill-defined problems (Kember et al., 1999; Rogers, 2001; Ramaley, 2014), providing students with opportunities to practise reflection can also help them in their future work. Instructing students clearly using the learning categories terminology could furthermore help students and educators use the same concepts to describe and evaluate learning. Next to improving communication, this might also reduce the number of phrases that are too ambiguous to be coded, as occurred in this paper.
For assessment, new directions are needed when dealing with a large variety of learned generic skills, as presented in this paper. Skills assessment calls for time, resources, and expertise, which are often not directly available. Mylopoulos et al. (2016) emphasise the need to move away from the more traditional path of testing students’ performance level in a summative manner. The learning categories provide possibilities for less traditional assessment methods, for example, using reflective questioning to assess students’ self-level, or asking students to apply their understanding and self-level to formulate intentions in a report or conversation. Combining these different assessment methods and moving away from solely summative assessment, as is done for instance with programmatic assessment, would provide students with the most information about their learning and could, in addition, help with monitoring their learning process (Schuwirth and van der Vleuten, 2011).
To conclude, students in the presented complex problem-solving course are the creators of their own learning, learning those generic skills that are relevant to them in their own ways. The systematic proposed in this paper helps to look beyond a one-size-fits-all approach and to appreciate students’ differences in learning. By contributing to the knowledge about generic skills development and improving the practical use of evaluation methods, educators can continue improving education to prepare students for a complex and changing society. Overall, the Generic Skills Learning Systematic and the findings of this study are intended to support educators in expanding their perspective on what learning entails and to show the variety in relevant generic skills for the professionals of the future.
Data availability statement
The raw dataset is available by request, which includes the raw data used and coded from the learner reports. The 43 complete learner reports cannot be shared due to participants’ identifiable data in the reports (e.g., personal descriptions and situations).
Ethics statement
Ethical review and approval was not required for the study on human participants in accordance with the local legislation and institutional requirements. Written informed consent for participation was not required for this study in accordance with the national legislation and the institutional requirements.
Author contributions
HR, RB, and RK coded the data and discussed the results. HR developed the systematic framework under the supervision of RB and RK. MS contributed to the theoretical framework. GD and HVMR contributed to the practical implications. All authors designed the research project, interpreted the results, and commented on the manuscript.
Conflict of interest
The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
Publisher’s note
All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.
Supplementary material
The Supplementary Material for this article can be found online at: https://www.frontiersin.org/articles/10.3389/feduc.2022.1007361/full#supplementary-material
References
Akkerman, S., Admiraal, W., Brekelmans, M., and Oost, H. (2008). Auditing quality of research in social sciences. Qual. Quant. 42, 257–274. doi: 10.1007/s11135-006-9044-4
Andrews, J., and Higson, H. (2008). Graduate employability, “soft skills” versus “hard” business knowledge: A European study. High. Educ. Europe 33, 411–422. doi: 10.1080/03797720802522627
Badcock, P. B. T., Pattison, P. E., and Harris, K.-L. (2010). Developing generic skills through university study: A study of arts, science and engineering in Australia. High. Educ. 60, 441–458. doi: 10.1007/s
Bakkenes, I., Vermunt, J. D., and Wubbels, T. (2010). Teacher learning in the context of educational innovation: Learning activities and learning outcomes of experienced teachers. Learn. Instr. 20, 533–548. doi: 10.1016/j.learninstruc.2009.09.001
Bennett, N., Dunne, E., and Carré, C. (1999). Patterns of core and generic skill provision in higher education. High. Educ. 37, 71–93.
Berg, T. B., Achiam, M., Poulsen, K. M., Sanderhoff, L. B., and Tøttrup, A. P. (2021). The role and value of out-of-school environments in science education for 21st century skills. Front. Educ. 6:674541. doi: 10.3389/feduc.2021.674541
Bridges, D. R., Davidson, R. A., Odegard, P. S., Maki, I. V., and Tomkowiak, J. (2011). Interprofessional collaboration: Three best practice models of interprofessional education. Med. Educ. Online 16:6035. doi: 10.3402/meo.v16i0.6035
Chamorro-Premuzic, T., Arteche, A., Bremner, A. J., Greven, C., and Furnham, A. (2010). Soft skills in higher education: Importance and improvement ratings as a function of individual differences and academic performance. Educ. Psychol. 30, 221–241. doi: 10.1080/01443410903560278
Chan, C. K. Y., Zhao, Y., and Luk, L. Y. Y. (2017). A validated and reliable instrument investigating engineering students’ perceptions of competency in generic skills. J. Eng. Educ. 106, 299–325. doi: 10.1002/jee.20165
Chevalier, A., Gibbons, S., Thorpe, A., Snell, M., and Hoskins, S. (2009). Students’ academic self-perception. Econ. Educ. Rev. 28, 716–727. doi: 10.1016/j.econedurev.2009.06.007
Crebert, G., Bates, M., Bell, B., Patrick, C. J., and Cragnolini, V. (2004). Developing generic skills at university, during work placement and in employment: Graduates’ perceptions. High. Educ. Res. Dev. 23, 147–165. doi: 10.1080/0729436042000206636
Dall’alba, G., and Sandberg, J. (2006). Unveiling professional development: A critical review of stage models. Thousand Oaks, CA: SAGE.
De Grez, L., Valcke, M., and Roozen, I. (2009). The impact of an innovative instructional intervention on the acquisition of oral presentation skills in higher education. Comput. Educ. 53, 112–120. doi: 10.1016/j.compedu.2009.01.005
Fishbein, M., and Ajzen, I. (2010). Predicting and changing behavior. New York, NY: Psychology Press.
Funke, J. (2010). Complex problem solving: A case for complex cognition? Cogn. Process. 11, 133–142. doi: 10.1007/s10339-009-0345-0
Gilbert, R., Balatti, J., Turner, P., and Whitehouse, H. (2004). The generic skills debate in research higher degrees. High. Educ. Res. Dev. 23, 375–388. doi: 10.1080/0729436042000235454
Harvey, M., Coulson, D., Mackaway, J., and Winchester-Seeto, T. (2010). Special issue of the Asia-Pacific Journal of Cooperative Education, work integrated learning (WIL): Responding to challenges. Aligning reflection in the cooperative education curriculum. Sydney, NSW: Macquarie University.
Hattie, J., Biggs, J., and Purdie, N. (1996). Effects of learning skills interventions on student learning: A meta-analysis. Rev. Educ. Res. 66, 99–136.
Jääskelä, P., Nykänen, S., and Tynjälä, P. (2018). Models for the development of generic skills in Finnish higher education. J. Further High. Educ. 42, 130–142. doi: 10.1080/0309877X.2016.1206858
Jones, A. (2009). Generic attributes as espoused theory: The importance of context. High. Educ. 58, 175–191. doi: 10.1007/s10734-008-9189-2
Kember, D., Jones, A., Loke, A., McKay, J., Sinclair, K., Tse, H., et al. (1999). Determining the level of reflective thinking from students’ written journals using a coding scheme based on the work of Mezirow. Int. J. Lifelong Educ. 18, 18–30. doi: 10.1080/026013799293928
Kohn Rådberg, K., Lundqvist, U., Malmqvist, J., and Hagvall Svensson, O. (2020). From CDIO to challenge-based learning experiences–expanding student learning as well as societal impact? Eur. J. Eng. Educ. 45, 22–37. doi: 10.1080/03043797.2018.1441265
Mason, G., Williams, G., and Cranmer, S. (2009). Employability skills initiatives in higher education: What effects do they have on graduate labour market outcomes? Educ. Econ. 17, 1–30. doi: 10.1080/09645290802028315
Matteson, M. L., Anderson, L., and Boyden, C. (2016). “Soft skills”: A phrase in search of meaning. Portal 16, 71–88. doi: 10.1353/pla.2016.0009
Morisano, D., Hirsh, J. B., Peterson, J. B., Pihl, R. O., and Shore, B. M. (2010). Setting, elaborating, and reflecting on personal goals improves academic performance. J. Appl. Psychol. 95, 255–264. doi: 10.1037/a0018478
Mylopoulos, M., Brydges, R., Woods, N. N., Manzone, J., and Schwartz, D. L. (2016). Preparation for future learning: A missing competency in health professions education? Med. Educ. 50, 115–123. doi: 10.1111/medu.12893
Nealy, C. (2005). Integrating soft skills through active learning in the management classroom. J. Coll. Teach. Learn. 2, 1–6.
Pache, A. C., and Chowdhury, I. (2012). Social entrepreneurs as institutionally embedded entrepreneurs: Toward a new model of social entrepreneurship education. Acad. Manag. Learn. Educ. 11, 494–510. doi: 10.5465/amle.2011.0019
Petruzziello, G., Chiesa, R., Guglielmi, D., van der Heijden, B. I. J. M., de Jong, J. P., and Mariani, M. G. (2022). The development and validation of a multi-dimensional job interview self-efficacy scale. Pers. Ind. Differ. 184:111221. doi: 10.1016/j.paid.2021.111221
Pintrich, P. R. (2002). The role of metacognitive knowledge in learning, teaching, and assessing. Theory Pract. 41, 219–225. doi: 10.1207/s15430421tip4104_3
Ramaley, J. A. (2014). The changing role of higher education: Learning to deal with wicked problems. J. High. Educ. Outreach Engagem. 18, 7–22.
Raybould, J., and Sheedy, V. (2005). Are graduates equipped with the right skills in the employability stakes? Ind. Commer. Train. 37, 259–263. doi: 10.1108/00197850510609694
Regueiro, B., Rodríguez-Fernández, J. E., Crespo, J., and Pino-Juste, M. R. (2021). Design and validation of a questionnaire for university students’ generic competencies (COMGAU). Front. Educ. 6:606216. doi: 10.3389/feduc.2021.606216
Remington-Doucette, S. M., Connell, K. Y. H., Armstrong, C. M., and Musgrove, S. L. (2013). Assessing sustainability education in a transdisciplinary undergraduate course focused on real-world problem solving: A case for disciplinary grounding. Int. J. Sustain. High. Educ. 14, 404–433. doi: 10.1108/IJSHE-01-2012-0001
Rittel, H. W. J., and Webber, M. M. (1973). Dilemmas in a general theory of planning. Policy Sci. 4, 155–169.
Rogers, R. R. (2001). Reflection in higher education: A concept analysis. Innov. High. Educ. 26, 37–57.
Rule, A. C. (2006). Editorial: The components of authentic learning. J. Authen. Learn. 3, 1–10.
Sandars, J. (2009). The use of reflection in medical education: AMEE Guide No. 44. Med. Teach. 31, 685–695. doi: 10.1080/01421590903050374
Santos Rego, M. A., Mella Núñez, Í., Naval, C., and Vázquez Verdera, V. (2021). The evaluation of social and professional life competences of university students through service-learning. Front. Educ. 6:606304. doi: 10.3389/feduc.2021.606304
Schuwirth, L. W. T., and van der Vleuten, C. P. M. (2011). Programmatic assessment: From assessment of learning to assessment for learning. Med. Teach. 33, 478–485. doi: 10.3109/0142159X.2011.565828
Shuman, L. J., Besterfield-Sacre, M., and McGourty, J. (2005). The ABET “professional skills” – Can they be taught? Can they be assessed? J. Eng. Educ. 94, 41–55.
Touloumakos, A. K. (2020). Expanded yet restricted: A mini review of the soft skills literature. Front. Psychol. 11:2207. doi: 10.3389/fpsyg.2020.02207
Tynjälä, P. (2008). Perspectives into learning at the workplace. Educ. Res. Rev. 3, 130–154. doi: 10.1016/j.edurev.2007.12.001
van Kesteren, B. J. (1989). Gebruiksmogelijkheden van het Learner Report [Possible uses of the learner report]. Tijdschrift Onderwijsresearch 14, 13–29.
Veltman, M. E., van Keulen, J., and Voogt, J. M. (2019). Design principles for addressing wicked problems through boundary crossing in higher professional education. J. Educ. Work 32, 135–155. doi: 10.1080/13639080.2019.1610165
Virtanen, A., and Tynjälä, P. (2019). Factors explaining the learning of generic skills: A study of university students’ experiences. Teach. High. Educ. 24, 880–894. doi: 10.1080/13562517.2018.1515195
Ward, P., Gore, J., Hutton, R., Conway, G. E., and Hoffman, R. R. (2018). Adaptive skill as the conditio sine qua non of expertise. J. Appl. Res. Mem. Cogn. 7, 35–50. doi: 10.1016/j.jarmac.2018.01.009
Keywords: generic skills, university students, learner reports, complex problem-solving, development, university course
Citation: van Ravenswaaij H, Bouwmeester RAM, van der Schaaf MF, Dilaver G, van Rijen HVM and de Kleijn RAM (2022) The generic skills learning systematic: Evaluating university students’ learning after complex problem-solving. Front. Educ. 7:1007361. doi: 10.3389/feduc.2022.1007361
Received: 30 July 2022; Accepted: 11 October 2022;
Published: 03 November 2022.
Edited by:
Ana Teresa Ferreira Oliveira, Instituto Politécnico de Viana do Castelo, Portugal
Reviewed by:
Lyubov Naydonova, Institute for Social and Political Psychology of NAES of Ukraine, Ukraine
Nuno Almeida, Polytechnic Institute of Leiria, Portugal
Copyright © 2022 van Ravenswaaij, Bouwmeester, van der Schaaf, Dilaver, van Rijen and de Kleijn. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
*Correspondence: Heleen van Ravenswaaij, h.vanravenswaaij-2@umcutrecht.nl