PERSPECTIVE article

Front. Complex Syst., 15 February 2024
Sec. Complex Systems Theory
This article is part of the Research Topic "Insights In Complex Systems Theory".

What about adaptiveness? The case of organisational resilience and cognition

  • Research Centre for Computational and Organizational Cognition, Department of Culture and Language, University of Southern Denmark, Slagelse, Denmark

This paper makes the simple, perhaps straightforward point that adaptiveness cannot be taken for granted when analysing a complex system. The paradigm of Complex Adaptive Systems (CAS) theory makes it clear that a key feature of complex systems is the ability to adapt to changes in their environment. This is, indeed, relevant to many systems (e.g., living and social systems), since change is embedded in the way in which systems evolve over time. At the same time, adaptiveness is a strong assumption to make, since it prioritises change over stability and can become a straitjacket, especially when it comes to studying complexity in the context of human social organising. By using a Case Study, this paper highlights the limits of a focus on adaptiveness and pushes for a more “neutral” perspective that allows researchers to appreciate a wider set of mechanisms, norms, and behaviours pertaining to complex social systems.

1 Introduction

This article is concerned with a specific and central assumption in Complex Adaptive Systems (CAS) theory and the way in which it applies to certain domains of the social sciences. According to proponents of CAS, there is a feature—typical of biological systems—that allows entities to “fit” in a particular set of environmental and internal conditions such that the result is a move away from the initially observed conditions of the system. This can be—and it is within this perspective—named adaptiveness, and it reflects an alignment between a mix of exogenous factors and the system.1

This assumption remains relatively unchallenged in CAS and especially among those who use this approach in their computational simulation work, namely, agent-based and system dynamics modellers (e.g., Miller and Page, 2007; Yolles, 2018). On the one hand, it is rather obvious that systems adapt, in the sense that they change in relation to internal and external conditions. But this characteristic is not exclusive to complex systems: simple unicellular organisms adapt, too. Therefore, we are referring to something else here. In fact, the adaptiveness of a complex system increases the likelihood that the system is unpredictable, since both the changes and their results are difficult to map onto a linear set of coordinates. In other words, the interplay among the elements of a system makes it difficult to determine how individual modifications affect the entire system. In this context, the term “adaptive” is fundamental to understanding these systems.

On the other hand, an emphasis on adaptiveness may neglect other (perhaps equally important) elements, including those aspects that allow the system to maintain its identity. Some may argue that adaptiveness is necessary to ensure system resilience (e.g., Chakwizira, 2022), i.e., those aspects that keep the system unaltered in its core functionality (and/or in its ontology). However, such an argument centres the discussion on the functions of adaptiveness by asking: what is adaptiveness for? This question keeps the emphasis on change within a system and leaves the unchanged aspects aside. This may not be intentional, but simply the result of a particular analytical and descriptive focus on one constitutive aspect of a complex system.

Against the backdrop of the above considerations, this paper initiates a line of enquiry on the (relative) rigidity of complex systems as a way to contrast and re-balance the discourse on adaptiveness. Since social systems are among those that can be better described with a CAS framework, this paper draws its examples from a Case Study on organisational resilience and how it is achieved.

After providing a summary of the history of core ideas that led to CAS, we discuss how cognition in the maintenance department of a large Danish utility company remains dependent on elements of stability in order for it to flexibly adapt to its environment. A few concluding remarks summarise the article’s insights.

2 Background

The beginning of complex systems research can roughly be dated back to the 1990s, with some predecessors such as artificial life already present in the 1980s (Manson, 2001). The roots of cybernetics, shaped in the 1950s by scientists such as John von Neumann, Warren McCulloch, Norbert Wiener, and Heinz von Foerster (van Dijkum, 1997; Vallée, 2009), laid the foundation for a field that, about 30 years ago, benefited from new mathematical and computational tools and techniques such as agent-based simulation, fractal analysis, and chaos theory. These developments enabled the quantitative analysis of patterns that could only be discovered through the use of such tools.

Complex systems research promised to become a new interdisciplinary research field concerned with patterns that can be identified across scientific disciplines and domains such as ecosystems, societies, traffic, financial markets, opinion formation, epidemic spreading, and the Internet and social media (Thurner et al., 2018), to name a few. Complex systems research investigates the behavioural properties of many interacting elements that generate open systems. Most importantly, the behaviour of complex systems is unpredictable. Basic features characterising these systems are non-linearity, emergence, adaptation, feedback loops, multiple scales, thresholds and tipping points, dynamic behaviour, relative path dependency, and self-organised criticality, including zones of high and low stability. Because of their ability to adapt to varying environmental circumstances, complex systems are interchangeably referred to as complex adaptive systems (CAS) (Miller and Page, 2007; Carmichael and Hadžikadić, 2019).

During the 1990s, system theorists—often associated with the Santa Fe Institute—such as Stuart Kauffman (1993), John Holland (2000), or Christopher Langton (1997), shaped the new view on complex systems: It was considered sufficient to describe elements interacting on the micro level by relatively simple rules, which generate patterns at the macro level, and then analyse these using the analytical and mathematical tools of complexity theory (Ellis and Herbert, 2010). For a long time, this analytical approach was the unifying principle of complex systems research, traversing the boundaries of scientific disciplines such as physics, chemistry, or biology.

A major problem that becomes apparent when applying a systems theory perspective is the question of leverage points: even small changes in the system or the system’s environment may induce drastic changes in the system’s behaviour. Therefore, knowledge about such leverage points is of utmost importance for managing complex systems. However, due to the properties of complex systems, it is often hard to predict if and where such a leverage point is reached. Nevertheless, some rules of thumb indicate where interventions are more or less likely to induce dramatic changes in a system’s behaviour. For example, Donella Meadows suggests a typology ranging from changes in the system’s parameters, which are unlikely to induce massive change, to changes in the mindset or paradigm of the entire system, which most likely will generate massive change (see Meadows, 1999).

Since its early days, research on CAS has attracted and been attracted by the social sciences (Byrne and Callaghan, 2014). Clearly, society is an open system with many interacting elements, including organisations, individual actors, groups, ecological phenomena, etc. Moreover, the different elements of society—e.g., markets (Tesfatsion, 2002), organisations (Fioretti, 2013), institutions (Greenwood et al., 2011), teams (Ramos-Villagrasa et al., 2018)—can also be considered complex systems in their own right. Put in technical terms, society is made up of a series of nested complex systems (see, e.g., Miller and Page, 2007). Thus, a society is a complex system and, certainly, also a complex adaptive system, since societies (and their parts) adjust to varying environmental and societal circumstances.

Already in the mid-1990s, groundbreaking work aimed at investigating social phenomena from the perspective of complexity science was emerging, such as ‘growing artificial societies’ (Epstein and Axtell, 1996) or Axelrod’s ‘complexity of cooperation’ (Axelrod, 1997). Since then, this approach has been used to investigate all kinds of social phenomena across disciplines such as sociology, political science, economics, and organisation science. Analyses of social systems using complexity science tools are less concerned with the phenomena themselves and more focused on their underlying generative principles. In particular, in the social sciences, complexity theory research is distinguished by its methodological choice of simulation experiments, chiefly using Agent-Based Modeling (ABM), to produce generative explanations.

The notion of generative explanations (Epstein, 2012) builds on the idea that local interactions of heterogeneous individual agents at the micro level may generate certain patterns, such as equilibrium prices or residential segregation, at the macro level. When simulation experiments—especially those using ABM—demonstrate that a pattern can be generated through such interactions, it is suggested that these interactions, governed by the rules of the simulation model, offer a potential explanation for the macro-level pattern (Epstein, 2012). Yet it remains only a candidate explanation, because it cannot be excluded that other interactions, based on different rules, generate the same pattern. It is claimed, however, that explanations of social macro-level phenomena must show how these can be generated “from the bottom-up” by interacting agents at the micro level. In this way, simple rules at the micro level should generate complex patterns at the macro level in the context of human social systems.
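
To make the logic of a generative explanation concrete, the following minimal sketch, which is purely illustrative and not drawn from the study or from any of the cited models, shows how Schelling-style micro rules can generate residential segregation as a macro-level pattern. The grid size, empty-cell ratio, similarity threshold, and number of steps are all assumed values chosen for readability.

```python
import random

# Illustrative Schelling-style sketch (not the article's model): agents follow a
# simple micro rule (relocate if fewer than THRESHOLD of your neighbours belong
# to your group) and a macro pattern (residential clustering) emerges.
SIZE, EMPTY_RATIO, THRESHOLD, STEPS = 20, 0.2, 0.5, 2000

def neighbours(grid, x, y):
    """Return the non-empty neighbours of cell (x, y) on a wrap-around grid."""
    out = []
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            if dx == 0 and dy == 0:
                continue
            cell = grid[(x + dx) % SIZE][(y + dy) % SIZE]
            if cell is not None:
                out.append(cell)
    return out

def unhappy(grid, x, y):
    """An agent is unhappy if the share of like neighbours is below THRESHOLD."""
    nbrs = neighbours(grid, x, y)
    return bool(nbrs) and sum(n == grid[x][y] for n in nbrs) / len(nbrs) < THRESHOLD

# Initialise the grid with two groups (0 and 1) and some empty cells (None).
grid = [[random.choice([0, 1]) if random.random() > EMPTY_RATIO else None
         for _ in range(SIZE)] for _ in range(SIZE)]

for _ in range(STEPS):
    empties = [(x, y) for x in range(SIZE) for y in range(SIZE) if grid[x][y] is None]
    movers = [(x, y) for x in range(SIZE) for y in range(SIZE)
              if grid[x][y] is not None and unhappy(grid, x, y)]
    if not movers or not empties:
        break
    (x, y), (ex, ey) = random.choice(movers), random.choice(empties)
    grid[ex][ey], grid[x][y] = grid[x][y], None  # one unhappy agent relocates

# Macro-level observable: average share of like neighbours across all agents.
occupied = [(x, y) for x in range(SIZE) for y in range(SIZE) if grid[x][y] is not None]
similarity = sum(sum(n == grid[x][y] for n in neighbours(grid, x, y)) /
                 max(len(neighbours(grid, x, y)), 1) for x, y in occupied) / len(occupied)
print(f"average share of like neighbours: {similarity:.2f}")
```

Because the rules are deliberately simple, whatever clustering the final measure reveals has been generated rather than assumed; in Epstein’s sense, the micro rules thereby offer a candidate explanation of the macro pattern, not a definitive one.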

3 Discussing adaptiveness

It is commonly assumed that complex systems exhibit characteristics that differ slightly depending on the domain in which they are observed, as well as across domains. Generalisation is, through this lens, an extremely difficult task. Nevertheless, the observation of an aspect pertaining to one domain may lead to enquiries and, perhaps, to the study of analogies in other domains.

It is with these considerations in mind that this paper uses a Case Study to discuss a number of aspects related to the nature of adaptiveness in complex systems. The description reported below is based on ethnographic data and observations recorded between August 2022 and June 2023.

3.1 Case study description

The complex system in question is a public service organisation, namely, a large Danish utility company. Among other services, this organisation offers district heating to households in the Greater Copenhagen Region. The company is responsible for a network of heating pipes roughly covering the entire region. The case focuses on the maintenance department, with three teams and a total of approximately 50 employees. These employees are in charge of monitoring the piping network and coordinating repairs where necessary.

In 2018, the department reached an agreement with an external company for gathering leakage data by using unmanned aerial vehicles (or simply drones) equipped with thermal imaging cameras. The drones are able to detect areas where hot water is leaking due to fractures or wear and tear of the pipes. The amount of leakage data now available is vast compared to what the department had access to before using drones (ca. 5 times more data per year; see Cowley and Gahrn-Andersen, 2023; Gahrn-Andersen, 2020).

The leakage data collected with drones are uploaded to bespoke software called Teraplan, which offers interactive radiometric orthomaps of all known leakages. Prior to the introduction of drones, no visual representation of leakages was available and the utility company had no means of collating data. The influx of data required changes in the processes, routines, and team-based approaches related to detecting, handling, and closing cases. With the new drone leakage detection system, the utility company now has an overview of the network’s condition in the form of points of interest, such as critical leakages, plus a general thermal imprint of the entire pipeline network. This overview makes it possible for the company to take a more proactive approach to network maintenance, as the organisation now relies on an abundance of thermographic data on thousands of leakages to plan repairs and take preventive actions in its maintenance operations. Before the change, maintenance work could only be reactive, in the sense that the company did not have an overview of leakages and had to wait for either a customer call, a random check, or other ad hoc inputs. Therefore, not only is the increase in the volume of input data significant, but the quality and nature of the work around maintenance also differs from what was previously done. As a demonstration of the disruptive nature of this change, one of the employees remarks:

“So back to something that I said earlier that we do not plan ahead a whole lot. But [employee responsible for drone cases] is actually the only one planning ahead a whole lot, because he knows he has to share the diggers, and he has to share the welders with us, so he has to plan, and mark areas where he maybe takes 2 As and a B [leakages identified by drones are ranked based on severity of water loss, where A is the most severe] and say: ‘These three are in close proximity to each other, please go and dig all three up’. That’s what we’ve been working with the last years, that’s what we are working towards.” (Operations and Maintenance Engineer).

Effectively, this means that leakages are now frequently grouped (clustered) on the basis of their geographical proximity, and they are dealt with as ‘bundles of leakages’ rather than as isolated occurrences (which was the standard in the old leakage-detection practice). Furthermore, the drones give an awareness of leakages that would not have been possible without the technology because, first, they can photograph private properties without violating privacy and, second, leakages are identified when the water is still in the insulation or in the ground, before surfacing:

“And this is something that the drones have had a big impact on because they were able to find all the leakages that we don’t get to see because it’s just going into the ground and not coming up anywhere to show.” (Team Leader)
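
To illustrate the proximity-based bundling just described, the sketch below groups hypothetical leakage records using a simple distance threshold. The coordinates, severity labels, and the 500 m grouping radius are assumptions made for the example and are not taken from Teraplan or from the company’s practice.

```python
from math import hypot

# Hypothetical leakage records: (id, x in metres, y in metres, severity),
# where severity ranks A (most severe) before B and C. None of these values
# come from the utility company's data.
leakages = [
    ("L1", 120, 340, "A"),
    ("L2", 150, 360, "A"),
    ("L3", 180, 300, "B"),
    ("L4", 2200, 900, "B"),
    ("L5", 2100, 950, "A"),
]

RADIUS = 500  # illustrative grouping distance in metres

def bundle(leaks, radius=RADIUS):
    """Greedily group leakages that lie within `radius` of an existing bundle member."""
    bundles = []
    for leak in leaks:
        _, x, y, _ = leak
        for group in bundles:
            if any(hypot(x - gx, y - gy) <= radius for _, gx, gy, _ in group):
                group.append(leak)
                break
        else:
            bundles.append([leak])  # start a new bundle for isolated leakages
    return bundles

for i, group in enumerate(bundle(leakages), start=1):
    ids = ", ".join(leak[0] for leak in group)
    worst = min(leak[3] for leak in group)  # "A" < "B" < "C", so min() is the most severe
    print(f"bundle {i}: {ids} (most severe: {worst})")
```

The sketch captures only the bare logic of treating nearby leakages as a single repair job; as the engineer’s quote above indicates, actual planning also weighs severity rankings and the sharing of diggers and welders across teams.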

In Meadows’ terms, what we are witnessing here is something akin to a mindset or paradigm shift, in the sense that it sets up novel systemic trajectories and, hence, alters fundamental aspects of the leakage-detection practice (Cowley and Gahrn-Andersen, 2023). Or, as Meadows (1999) puts it:

“Systems folks would say you change paradigms by modeling a system, which takes you outside the system and forces you to see it whole. We say that because our own paradigms have been changed that way.”

In this connection, we can say that the Teraplan software enables the modeling of the system (namely, the complex network of pipes, residences, roads, and other critical infrastructures). However, we use Meadows’ mentalist parlance with caution, since the management of the utility company has had no overall plan for how the drone data are used, nor for how they can appeal to collective representations of the practice. The change is simply something that happens incrementally as the utility company tries to understand how best to make use of the new data and how to construct knowledge about leakages around them. Another clear indicator of this is that drone-based data have not fully replaced leakage repairs based on notifications from passers-by.

3.2 Effects on the system

This subsection is dedicated to speculative arguments around possible and plausible system (i.e., organisational) change, given the scenario above. In other words, we answer the question: if an organisation is conceived as a complex adaptive system, how would one expect it to adapt when it faces change in its core operations? That is, what would be a theoretically viable path from a CAS-inspired perspective?

A change such as the one described above is likely to act as a shock to an organisation. Data on leakages are, from the perspective of the maintenance department, the basic information (i.e., the input) on which operations are performed. Hence, it is reasonable to assume that such a change in the quality and quantity of inputs would, in turn, cause changes in organisational structure and procedures.

3.2.1 Structure

From an analytical perspective, such a massive change in data quality, structure, and availability should be reflected in the way in which knowledge is framed, understood, and handled in the organisation. This would undoubtedly require structural changes that allow the organisation to act upon the new data. For example, it is plausible that the tasks related to data screening and handling multiply, while those allocated to (traditional) leakage detection are reduced. The organisation may well decide to start a transition in which action (e.g., fixing leakages) becomes more and more oriented towards drone-generated data rather than traditional means of data collection. This description is consistent with the principle of self-organisation in CAS (Miller and Page, 2007; Byrne and Callaghan, 2014) and, in this specific case, it would require a shift towards a different setting where teams are rebuilt around more data-driven structures. In other words, self-organisation is a way to achieve adaptation.

3.2.2 Procedures

Certainly, the way in which the drone-generated data flow in the organisation requires channels that are different from the ones already in place. Since these data streams have considerably higher volumes, offer more precision and detail, and are generated at higher speed, they can be considered “Big Data” and, as experts advise, they require analytical techniques that differ from those employed for standard data. For this reason, it is wise for an organisation to adapt its procedures in order to make sense of the data and to increase its efficiency (how it uses the information it now has) and effectiveness (how many more cases it fixes than it used to). One could imagine that these new procedures would be emergent properties of the CAS, as they “grow up” or are generated by practice and interactions among employees/agents (Epstein and Axtell, 1996; Epstein, 2012).

3.3 What is missing?

The perspective described above is consistent with features of CAS theory and with a general approach to systems that is change-bound. But, is this actually plausible? In other words, is it plausible to have adaptiveness as a benchmark as opposed to approaching a given system with a more “neutral” perspective?

The claim here is that there are at least two concerns deriving from an emphasis on adaptiveness: one relates to resilience, and the other to the mid-layers of a system.

3.3.1 Understanding resilience

In the case of the utility company, the introduction of drones may be perceived as an antecedent to possible structural and procedural changes. Even though it is not necessary for a complex system to react to stimuli from its environment, such changes are likely—and even expected—when the stimuli are perceived to be radical, massive, or to affect core aspects of the system.

However, a system—and especially a social system—evolves dynamically over time together with and, at the same time, independently of its environment. In other words, there is co-evolution (Ellis and Herbert, 2010) but also i-evolution—the independent (internal) modifications that naturally occur in a system, mainly due to time and the internal conditions that characterise its elements. The juxtaposition of these two types of evolution seems contradictory and requires an explanation. There are simultaneous forces exerted on a social system such as an organisation. Some forces pull the organisation in the same direction as other organisations in the environment and indicate the prevalence of an institutional logic or of organisation-environment co-evolution (e.g., Greenwood et al., 2011). Other forces pull the organisation away, instead. These can be due to the trajectory an organisation has set for itself through, for example, strategic planning. Internal forces or parallel (perhaps recessive) environmental forces play a key role here. For these reasons, an organisation is constantly confronted with competing and sometimes divergent evolutionary forces.

The main argument here is that it is difficult, if not impossible, to predict which forces will affect a complex system, when, and how. The nature of the interacting forces is, by definition, complex. Such a system would not necessarily adapt. Or, to be more precise, its path to adaptation could be unique, original, unorthodox, and oriented towards preservation or non-existence. Preservation could be oriented towards parts of the system, such as processes or procedures, but not necessarily towards the system as a whole. In fact, it is sometimes perfectly acceptable that a system—an organisation—sees more value in ceasing to exist than in continuing to struggle (and in, e.g., creating another organisation).

In the case study above, the organisation did not adapt as such. Instead, it applied existing logic and procedures to the vast amount of drone-generated data, thus gradually gaining a foothold. This allowed the organisation to reduce the need for change, limit investments, save resources, and maintain its workforce and capabilities. Naturally, some new procedures were introduced in order to cope with the new data streams, but these did not become central to the organisation’s operations, nor have the drone-generated data become the sole source of leakage information. From this perspective, one could argue that the organisation showed stationarity and ergodicity in relation to its structure and procedures. In other words, i-evolution forces seem to prevail over co-evolution, or to help “decode” co-evolutionary forces in a way that is more suitable to the organisation.

The ability to maintain functionalities and operations can be referred to as resilience. In this organisational system, resilience is much more anchored to practice and affordances than one might expect (Gahrn-Andersen, 2023). This does not necessarily mean that the organisation’s behaviour is more predictable—since a CAS perspective leads one to expect change—but it does suggest that a better understanding comes from considering both stability and change.

3.3.2 The meso domain

One of the claims advanced in the last paragraphs is that an organisation may have different objectives, practices, norms, expectations, etc., depending on its elements (or parts). Not only is this due to structural differences that may exist within and across teams, groups, divisions, or departments, but it is also a result of sentient beings interacting with each other and with the structures, resources, tools, etc., available to them in the organisation. These socio-cognitive phenomena allow individuals to make sense of their surroundings (Hutchins, 1995; Hutchins, 2014) and are, when considered individually, complex systems in their own right. Thus, an organisation is a complex system composed of large numbers of other complex systems.

To some, the in-between meso consists mainly of norms and rules that allow individuals to relate to each other (Yolles, 2018). More generally, these are what constitute the system, in that they establish the relations between the whole and its elements. The meso is therefore the glue that holds the system together.

More than that, the meso can be framed as a domain in which sociality-based cognition actually takes place (Secchi and Cowley, 2021). Not only are norms enforced here, but practices, behaviours, thinking, and action materialise in such a way that the organisation can function. The meso is a continuous flow where meaning is constantly (re)defined or (re)iterated (Secchi et al., 2022). In the context of the utility company, this implies that the drone-generated data were interpreted through existing organisational practices. Because the new leakage data source was devised as a mere add-on to existing maintenance work, no significant concerns were raised over the need to reorganise existing practices. The only adjustment made was assigning an employee to handle the new data in such a manner that the knowledge generated from these data would ultimately align with data from traditional sources and with existing repair operations and procedures. Thus, the richness, different quality, and high volume of drone-generated data had only insignificant effects, if any, on the core aspects of the organisation (see, e.g., Secchi et al., 2024). From this perspective, one may argue that the data adapted to the organisation rather than the other way around.

4 Concluding remarks

In this paper, we thematise the relation between adaptiveness and stability in complex systems, arguing that organisations do not mechanistically adapt to their environments. Rather, organisations—or other complex social systems—have a high degree of internal complexity, which ensures some level of stability over time. This, we argue, allows organisations to be resilient in the face of external and internal influences, especially when confronted with exogenous shocks such as the introduction of potentially disruptive technologies. Although it makes little sense to explore macro-micro relations without a focus on the socio-cognitive dynamics that give rise to them (i.e., the meso), it is important to note that their interdependencies may be of a different order. In some cases they entail that norms, procedures, practices, and even organisational identity change (i.e., adapt), while in others these phenomena may remain identical over time, or what we refer to as “stability through (re)iteration”. Thus, we argue, the role of stability in adaptiveness is a topic in need of further exploration.

Data availability statement

The raw data supporting the conclusions of this article will be made available by the authors, without undue reservation.

Ethics statement

The studies involving humans were approved by Research & Innovation Organisation - University of Southern Denmark (Review No. 11.583). The studies were conducted in accordance with the local legislation and institutional requirements. Written informed consent for participation was not required from the participants or the participants’ legal guardians/next of kin in accordance with the national legislation and institutional requirements.

Author contributions

DS: Conceptualization, Funding acquisition, Writing–original draft, Writing–review and editing. MN: Conceptualization, Writing–original draft, Writing–review and editing. MF: Investigation, Writing–review and editing. RG-A: Funding acquisition, Writing–review and editing.

Funding

The author(s) declare financial support was received for the research, authorship, and/or publication of this article. This article benefited from funding awarded by the VELUX Foundations for the project Determinants of Resilience in Organisational Networks (DRONe) (grant number: 38917).

Acknowledgments

The authors wish to thank the managers and employees of the large Danish utility company that participated in the study and allowed for the ethnographic data to be collected.

Conflict of interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

The author(s) declared that they were an editorial board member of Frontiers, at the time of submission. This had no impact on the peer review process and the final decision.

Publisher’s note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

Footnotes

1Typically, adaptation also pertains to the components of CAS, mainly referred to as agents (Tesfatsion, 2002; Miller and Page, 2007; Orange, 2019). The points discussed in this article extend to other CAS components, although our concern here is adaptiveness in relation to macro system dynamics.

References

Axelrod, R. (1997). “Advancing the art of simulation in the social sciences,” in Simulating social phenomena. Editors R. Conte, R. Hegselmann, and P. Terna (New York: Springer-Verlag), 21–40.

Byrne, D., and Callaghan, G. (2014). Complexity theory and the social sciences. The state of the art. Abingdon, UK: Routledge.

Carmichael, T., and Hadžikadić, M. (2019). The fundamentals of complex adaptive systems. Germany: Springer.

Chakwizira, J. (2022). Stretching resilience and adaptive transport systems capacity in South Africa: imperfect or perfect attempts at closing covid-19 policy and planning emergent gaps. Transp. Policy 125, 127–150. doi:10.1016/j.tranpol.2022.06.003

Cowley, S. J., and Gahrn-Andersen, R. (2022). How systemic cognition enables epistemic engineering. Front. Artif. Intell. 5, 960384. doi:10.3389/frai.2022.960384

Ellis, B. S., and Herbert, S. (2011). Complex adaptive systems (CAS): an overview of key elements, characteristics and application to management theory. J. Innovation Health Inf. 19, 33–37. doi:10.14236/jhi.v19i1.791

Epstein, J. M. (2012). Generative social science: studies in agent-based computational modeling. Princeton, NJ: Princeton University Press.

Epstein, J. M., and Axtell, R. (1996). Growing artificial societies: social science from the bottom up. Cambridge, MA: MIT Press.

Fioretti, G. (2013). Agent-based simulation models in organization science. Organ. Res. Methods 16, 227–242. doi:10.1177/1094428112470006

Gahrn-Andersen, R. (2020). Making the hidden visible: handy unhandiness and the sensorium of leakage-detecting drones. Senses Soc. 15, 272–285. doi:10.1080/17458927.2020.1814563

Gahrn-Andersen, R. (2023). Informational resilience in the human cognitive ecology. Entropy 25, 1247. doi:10.3390/e25091247

Greenwood, R., Raynard, M., Kodeih, F., Micelotta, E. R., and Lounsbury, M. (2011). Institutional complexity and organizational responses. Acad. Manag. Ann. 5, 317–371. doi:10.5465/19416520.2011.590299

Holland, J. H. (2000). Emergence: from chaos to order. USA: Oxford University Press.

Hutchins, E. (1995). Cognition in the wild. Cambridge, MA: MIT Press.

Hutchins, E. (2014). The cultural ecosystem of human cognition. Philos. Psychol. 27, 34–49. doi:10.1080/09515089.2013.830548

Kauffman, S. A. (1993). The origins of order: self-organization and selection in evolution. USA: Oxford University Press.

Langton, C. G. (1997). Artificial life: an overview. Cambridge, MA: MIT Press.

Manson, S. M. (2001). Simplifying complexity: a review of complexity theory. Geoforum 32, 405–414. doi:10.1016/s0016-7185(00)00035-x

Meadows, D. (1999). Leverage points: places to intervene in a system. Hartland, VT: The Sustainability Institute.

Miller, J. H., and Page, S. E. (2007). Complex adaptive systems. An introduction to computational models of social life. Princeton, NJ: Princeton University Press.

Orange, V. (2019). Supercomplexity in interaction: an introduction to the 4Es. Cham: Palgrave Pivot.

Ramos-Villagrasa, P. J., Marques-Quinteiro, P., Navarro, J., and Rico, R. (2018). Teams as complex adaptive systems: reviewing 17 years of research. Small Group Res. 49, 135–176. doi:10.1177/1046496417713849

Secchi, D., and Cowley, S. J. (2021). Cognition in organisations: what it is and how it works. Eur. Manag. Rev. 18, 79–92. doi:10.1111/emre.12442

Secchi, D., Gahrn-Andersen, R., and Cowley, S. J. (Eds.) (2022). Organizational cognition: the theory of social organizing. Routledge studies in organizational change and development, Vol. 28. New York and Abingdon: Routledge.

Secchi, D., Gahrn-Andersen, R., Festilia, M. S., and Neumann, M. (2024). “Multiple systems in the meso domain: a study in organizational cognition,” in Multiple systems: complexity and coherence in ecosystems, collective behavior, and social systems. Editors G. Minati, and M. P. Penna (Germany: Springer Nature). chap. 17. 209–218.

Tesfatsion, L. (2002). Agent-based computational economics: growing economies from the bottom up. Artif. life 8, 55–82. doi:10.1162/106454602753694765

Thurner, S., Hanel, R., and Klimek, P. (2018). Introduction to the theory of complex systems. USA: Oxford University Press.

Vallée, R. (2009). History of cybernetics. Syst. Sci. Cybern. 3, 22–34.

van Dijkum, C. (1997). From cybernetics to the science of complexity. Kybernetes 26, 725–737. doi:10.1108/03684929710169898

Yolles, M. (2019). The complexity continuum, part 1: hard and soft theories. Kybernetes 48, 1330–1354. doi:10.1108/k-06-2018-0337

Keywords: social systems, social sciences, organisational cognition, nested complexity, adaptiveness

Citation: Secchi D, Neumann M, Festila MS and Gahrn-Andersen R (2024) What about adaptiveness? The case of organisational resilience and cognition. Front. Complex Syst. 2:1329794. doi: 10.3389/fcpxs.2024.1329794

Received: 29 October 2023; Accepted: 05 February 2024;
Published: 15 February 2024.

Edited by:

Christian Beck, Queen Mary University of London, United Kingdom

Reviewed by:

Cedric de Coning, Norwegian Institute of International Affairs (NUPI), Norway

Copyright © 2024 Secchi, Neumann, Festila and Gahrn-Andersen. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Davide Secchi, secchi@sdu.dk
