REVIEW article

Front. Digit. Health, 05 March 2024
Sec. Digital Mental Health
This article is part of the Research Topic Virtual Presence: Loneliness, technology and the production of human (dis)connectedness.

Digital loneliness—changes of social recognition through AI companions

  • 1Department of Philosophy, Ethics, and Religious Studies, Faculty of Humanities and Human Sciences (Graduate School), Hokkaido University, Sapporo, Japan
  • 2Center for Human Nature, Artificial Intelligence, and Neuroscience (CHAIN), Hokkaido University, Sapporo, Japan

Inherent to the experience of loneliness is a significant change of meaningful relatedness that (usually negatively) affects a person’s relationship to self and others. This paper goes beyond a purely subjective-phenomenological description of individual suffering by emphasizing loneliness as a symptomatic expression of distortions of social recognition relations. Where there is loneliness, a recognition relation has changed. Most societies face an increase in loneliness across all groups of their population, and this sheds light on the reproduction conditions of social integration and inclusion. These functions are essential lifeworldly components of social cohesion and wellbeing. This study asks whether “social” AI promotes these societal success goals of social integration of lonely people. The increasing tendency to regard AI Companions (AICs) as reproducers of adequate recognition is critically discussed in this review. My skepticism requires further justification, especially as a large portion of sociopolitical prevention efforts aims to fight the increase of loneliness primarily with digital strategies. I will argue that AICs reproduce rather than sustainably reduce the pathodynamics of loneliness: loneliness simply gets “digitized.”

1 Introduction: the digital turn of social recognition

It is evident that the digitalization of human relations is affecting our social recognition relations, particularly with respect to the conditions of social integration and participation. What on the one hand may lead to stronger networking, faster exchange of information, social visibility, and empowerment comes, on the other hand, with the social challenges posed by loneliness as a social pathology (1, 2). The sociopathological dimension of loneliness becomes visible not only in the (inter-)subjective phenomenality of suffering but is also assessed against the backdrop of systemically induced changes of the lifeworld, such as the use of highly advanced social AI. For highly digitized societies—think, for instance, of Japan, whose industry invests millions in the production of x-bots1 to compensate for the shortage of skilled nursing staff2—AI Companions (AICs) are apparently perceived as an adequate technical (systemic) strategy for dealing with societal problems such as the endemic loneliness anticipated in the future. In an ideal scenario, social AI would contribute to a change of systemic and lifeworldly structures in the direction of enhanced social integration of lonely people and future loneliness prevention. This makes it necessary to critically assess the effects of digital solutions for global “loneliness management,” particularly against the backdrop of current medical and epidemiological research: Persistent experiences of loneliness (3) are considered, alongside other well-known factors such as poor nutrition, stress, noise, or low socioeconomic status (4), for their pathogenic potential [cf. (5–8)], so that effective loneliness prevention would significantly change the onset, manifestation, and persistence of specific illnesses [e.g., depression; cf. (9–11)] in recent societies. Moreover, the sheer quantity of social relationships and optimal networks between people apparently cannot prevent people from feeling lonely, isolated, and socially excluded (12). This correlates with individual disposition and resilience factors, e.g., with certain personality traits (8, 13), as empirical studies have shown (14), and depends on the personal attitude according to which someone evaluates their own situation as loneliness. When non-trivial3 suffering from loneliness occurs, i.e., when loneliness is not only recognized as a potential threat to individual health but is also seen for the particular social miseries that it produces, it can be reconsidered as a new form of precariat. Particularly vulnerable social groups [e.g., the ill, the elderly, and the socioeconomically disadvantaged, particularly children (15–17) and young adults (18)] are especially affected by it. This requires interdisciplinary societal (e.g., healthcare, political, medical, ethical, etc.) intervention strategies4.

Many interventionist strategies for reducing social isolation (19) have been formulated over the years. Loneliness prevention is a top goal of public health and “built environment” projects (20), while it has only recently become a major topic of global healthcare politics and governmental action (21, 22). What current interventionist approaches could stress more is that loneliness is a symptom of disturbed social recognition relations. I will explain this in a first step by specifying digital loneliness as implying significant alterations of meaningful relatedness (23–25). Although there are also some positive, affirmative readings of loneliness—for instance, its praise as solitude (26, 27)5—this paper is concerned rather with its shadow side. Problematic changes of personal relationships in the digital age have already been outlined with respect to alienation phenomena [e.g., in terms of “self-commodification” (29), “acceleration” (30), and “atomization” (31)] under the auspices of a cultural-reflexive analysis.

This allows one to specify loneliness as a painful experience of a lack of, or false, social recognition in the context of a (cultural) theory of the digital. My question is whether AICs might be able to compensate for, and even provide, some effective forms of social recognition to prevent such experiences of alienation. One can be concerned about the increasing tendency to regard AICs as reproducers of adequate recognition. Despite all the fascination with AICs, this philosophical-critical review asks whether they are changing a certain understanding of social recognition in a lasting, maybe not solely positive way. The digitalization of detachment, for which I believe human–AIC relations paradigmatically stand, can be highlighted: What appears surmountable with an AIC as a relational artifact (32) is actually often reproduced by it: digital loneliness! A critical theoretical view on AI companionship must accordingly ask about the normative consequences of treating social AI as if it were human, which is sketched in a second step. With a view to the embedding conditions of social AI, there are reasons to claim that one probably shouldn’t spend too much time with a robot companion. This opposes the narrative of AICs as adequate substitutes for human forms of relatedness. It is concluded in a third step that the social malaise of epidemic loneliness is the result of attempts to solve non-trivial suffering with AI cognition rather than human recognition. This may be considered ethically misguided if the practice of humanizing AI comes at the expense of a dehumanization of relatedness, which is the conclusion of this analysis.

2 (Digital) loneliness—connected, yet alone!

There are many theories, and multifold interdisciplinary readings, of the notion of loneliness. Weiss (33) has stressed a methodological flaw of loneliness research as follows:

What seems to me wrong with many current “definitions” of loneliness is that they are insufficiently sensitive to loneliness's status as a real phenomenon. (…) They define it by the conditions that might theoretically give rise to it. (…) Other definitions of loneliness suggest a theoretical idea of what is at the heart of loneliness. (…) Actually, these not only are not descriptions, they are not definitions. They are mini-theories. By wrapping together identification of the phenomenon (“this is loneliness”) with an explanation for the phenomenon, they foreclose the critical research question (33).

To avoid literally foreclosing the critical research question, it might be helpful to ask what can be derived from different disciplinary perspectives on loneliness for my reconceptualization of loneliness as a symptom of distorted social recognition relations. I have suggested an understanding of “loneliness” as a significant change of meaningful relatedness against the backdrop of three anthropological premises: (1) we are (like other animals) relational beings, (2) as such, we need some sort of social recognition (predominantly in the basic forms of mutual empathic understanding and respect), and (3) we can suffer from loneliness because we are sentient beings [cf. (25, 34)]. This makes it plausible that we often evaluate loneliness as a condition that is impairing or negatively interfering with our wellbeing [see also (35)], and with our health respectively, as the notion of wellbeing is essential to an understanding of psychosocial health (36–39). Loneliness affects the wholeness of one’s self-world relation: it changes the evaluative processes of a person (i.e., their cognitive, affective, and volitional patterns of enaction), and therefore changes the way in which someone relates to self, others, and the world. It apparently always takes place in social embedding relationships, i.e., there is no such thing as loneliness without the (embedding) sphere of the social. One can be (physically) alone without feeling lonely, and, moreover, one can be around others and yet still feel fundamentally lonely (40, 41). It is apparently the quality of relatedness to others that changes: Loneliness occurs because a relationship does not attain an expected or desired level of quality and significance (42). Thus, it strikingly reveals itself when one is around others. This points to the important difference, but also the relation, between “external” factors (like physical isolation or being socially excluded from a group) and the inner corresponding feelings or stances regarding these experiences (43, 44). The peculiar double-aspectivity of loneliness as an ambivalent contact-psychological situation is captured by the reminder that it is often experienced both as a burden and as a requirement, e.g., it can be of some sort of instrumental value, for instance for personal goals of contemplation or recovery (45). If we place loneliness within the differentiated structure of interpersonal relatedness (46), it acquires its meaning only through the reverberation of past relationships or the anticipation of future ones. What hurts in loneliness is particularly the loss of emotional and intellectual closeness to others, which often reveals experiences of being “socially invisible,” excluded, not appreciated, neglected, being an “outcast,” etc. (47). Loneliness therefore can be called a social pathology because it is an expression of “dysfunctions that violate a society as a whole at the sensitive interface of individuation and social integration” (48). It is a phenomenon that is caused by disturbed symbolic reproduction dynamics of the lifeworld. These become more specifically graspable as the result of failed processes of social integration. Attempts to cure this with AICs might exemplify what “goes wrong” in these digital processes of integration.
The philosopher Axel Honneth has described these pathogenic dynamics as a “forgetfulness” of social recognition, according to which one can also differentiate forms of social invisibility (49) and respective strategies of the “invisibilization” of others, which is one way to explain the dynamics of social disintegration and marginalization of individuals and groups. In principle, the adoption of an objectifying stance can be normatively permissible in many cases, but with his analysis of a “forgetfulness of recognition,” one can focus on the dynamics of reification that often erode the very preconditions for any trustful intersubjective practice and corrupt basic modes of mutual respect and understanding, and thus the prerequisites for ethical practice in the social sphere. Honneth says, “this kind of ‘forgetfulness of recognition’ can now be termed ‘reification.’ I thereby mean to indicate the process by which we lose the consciousness of the degree to which we owe our knowledge and cognition of other persons to an antecedent stance of empathetic engagement and recognition” [cf. (50)]. We suffer from loneliness because we perceive instantly what is essentially missing: a relatedness to others in which we experience ourselves as adequately socially recognized. The crucial point of Honneth’s theory—therein adding something new to the standard view on loneliness as a mere subjective feeling (51)—is that for many loners this goes hand in hand with the suffering from being ostracized, rejected, overlooked, not taken seriously as a participant (e.g., by their families, peers, at the workplace, by authorities), which places loneliness within a broader frame of the reproduction dynamics of social recognition: One must therefore discriminate between a lack of recognition and expressions of false recognition, both of which include (sometimes intentional) strategies of “invisibilization”: While a lack of social recognition implies a fundamental neglect of the other [and can also include the intention to harm someone through neglect, e.g., by objectifying someone as a mere enemy, as Sticker (52) has described it], false social recognition refers to the instrumentalization of agents by others, i.e., being valued as a mere means to an end for the other, which equally can cut people off from basic social inclusion. This lack of empathy is a “deficiency” mode of social recognition. Interactions that cannot reproduce the respective forms of social inclusiveness and integration can be assessed not only as potentially unethical and/or legally problematic, but also as contributing to the perpetuation of a social malpractice that alienates people and defines the status of loneliness as a socially precarious condition. At the moments when people “cannot conceive of themselves as actively participating and interrelated members of a jointly experienceable society” (48), the (pathogenic) potential of loneliness as both a cause and, symptomatically, an expression of disturbed recognition relations can be stressed. This negative experience can induce agents to withdraw from such social constellations to seek out more satisfying connections, and this, in principle, does not rule out the fact that basic experiences of appreciation are sought and (allegedly) found in relation to social AI. It seems uncontroversial to claim that AICs do “recognize” humans in a technical sense, but my point is to ask whether we can really assume capabilities for social recognition.
If it is reasonable to accept that loneliness is the kind of alienation experience that evolves out of, and is an expression of, distorted social recognition relations (graspable as a lack of, or a false, recognition practice), and social AI is designed to fill precisely this gap of an experienced lack of relatedness or should even compensate for experiences of false recognition, it is fair enough to question whether AICs, no matter how “sociable” they appear, can really help to overcome loneliness. This would imply that lonely people are placed in a relation that basically allows them to feel connected in the first place. While I believe that AICs can provide this basic feeling of “connectedness,” one can doubt whether these devices sufficiently contribute to the kind of social inclusion that people experiencing loneliness really need. I suspect that we are rather dealing with the digital variant of a fundamental lack of recognition in human–AI companionship, albeit with AICs delivering the perfect illusion of recognition. Hence, there might be the concern that AICs contribute to sustaining loneliness rather than offering a way out of it. But how do the (patho)dynamics of loneliness relate to altered social recognition?

2.1 Human companionship: recognition first!

If “relatedness” is the backdrop for perceiving loneliness as a state in which something is fundamentally “missing,” the inherently evaluative endeavor that is crucial for all dynamics of intersubjective encounters (53, 54) becomes relevant. People are involved in all kinds of interactivity that often give rise to the particular experience of a “we-feeling” (55, 56) or accompany experiences of a being-with (57). In the reading suggested here, the notion of loneliness is not stressed as a “synthetic a priori” of human consciousness [as, e.g., suggested by Mijuskovic (58); for review, see Jacobs (59)], but rather is perceived as a condition that necessarily has to be commemorative of the primordiality of social recognition (50, 60). This primordiality of (affective) relatedness is the ontological and conceptual prerequisite for being able to experience any “lack” of it. This may become more plausible when we stress the developmental aspect of social cognition (61–63). Through the recognitional “interactive” modes of imitation, joint attention, and affective contagion, we can presuppose a priority of recognition over cognition from a biopsychosocial developmental perspective (24, 64). Consequently, the perception of a lack of social relatedness or suffering from the pain of social disconnectedness would not even be possible without the all-important (basically affective) experiences of previous interactions that impregnate our brains (61). Thus, what some epistemological approaches to loneliness à la Mijuskovic mostly fail to address is these core experiences of intersubjectivity. These are the experiential prerequisites for being able to register an impairment of relational experience. This is often accompanied by an additional feeling or judgment, e.g., that it is (felt as) unpleasant, distressing, or impairing. Consistent with this view is that experiences of loneliness play a necessary role in the individuation process, as has been exemplarily outlined by Winnicott (65) and in other theories that focus on social relatedness and its disorders, respectively (66–70). Although loneliness clearly has an affective dimension (we often feel lonely when we are lonely), it nevertheless is not a distinct emotion. Rather, it is the framing condition for very different emotional episodes to appear (e.g., fear, sadness, forlornness, etc.), and, simultaneously, it reveals our affective vulnerability: the suffering from loneliness literally can be nerve-wracking, as it is, after all, associated with a state of emotional distress, which occurs as a reaction to experiences of being alienated or misunderstood, socially rejected, and/or otherwise restricted in opportunities for emotional intimacy with others [cf. (71)]. Neurobiological research moreover associates the processing of experiences of social exclusion with the anterior cingulate cortex, where physical pain is also processed [cf. (72–74)]. This “social pain” hypothesis of loneliness can be further linked to the evolutionary view on its supposed (mal-)adaptive functions. Here it seems that feeling lonely motivates people to seek contact with others (75). By contrast, its maladaptive effects have been discussed along the lines of the abovementioned inherent social exclusion dynamics: It is true that loners often fight a “struggle for social recognition,” i.e., being lonely implies unequal treatment or a lack of social participation that negatively affects the self-relation of a person (76).
It has been shown that loneliness is associated with stigmatization processes (77) and experiences of shame (78), and it often leads to situations in which loners are intentionally socially shunned because of the very fact that they are lonely [cf. (79)]. This social exclusion of lonely people has been explained with studies on emotional contagion that provide evidence that loneliness “spreads” (80) even among larger populations like a virus. For vulnerable people, this often comes with the experience of social rejection and isolation, as their isolation is (sub-)consciously perceived by others who do not offer support to loners but display quite the opposite behavior to “protect” themselves from “catching” it. It is through these inherent social dynamics of loneliness as an increasing process (Germ. Vereinsamung) that it manifests as a chronic condition. In chronic loneliness, people are no longer able to assess their condition against (rememberable) experiences of closeness, affective resonance, and existential security with others. This can include a reduced capability to interpret social cues correctly, i.e., it is not solely because they are lonely that people are shunned, but also because they might display altered recognition of others, for instance, due to their anticipation of being rejected (81). Others may distance themselves not only because they may feel overwhelmed by the lonely person’s need for recognition, but also because they display socially avoidant behavior, which has been suggested as serving a self-protective function in loneliness (82).

With such an emphasis on the role of intersubjective, particularly inter-affective, dynamics, it may appear even more plausible that sentient, relational beings are also affectively responsive toward objects like AICs, even developing a bond to these devices [cf. (83)]. This seems even more likely when x-bots mimic human interactional patterns, which is possible as most AICs come equipped, for instance, with emotion detection that allows them to track and directly “adjust” to the particular moods of people (84–86) [for a review, see Spezialetti et al. (87)]. In addition, AICs are often perceived as objects of patience (88) and are not treated as mere “technical devices.” Instead, there is the tendency to perceive them as if they were human, which correlates with the extent to which AICs appear human-like (89). It is the relational design of adaptivity to the specific needs of human intersubjective (e.g., communicative) practice that, together with a particular responsivity of AICs, can make people forget that “the subject they have called is (still!) not available.”

That being said, one can now focus on the central question: Can AICs be helpful with loneliness? It seems wise to opt for a pragmatic view: one can stress the benefits and the individual experience of feeling less lonely with an AIC, which can then be reassessed against the backdrop of possible negative impacts that this relationship (in the long run) might imply. This, however, does not rule out a more conceptual view: One might aim to answer the question of whether social AI can be adequate or sufficient for supporting people and/or for societal loneliness prevention by focusing on features of AICs that may allow us to assess them as either “capable of social recognition” (which respectively also opens up the possibility of false recognition when one is related to an AIC) or not capable at all by definition. Whether or not this would imply that we need an extended theory of social recognition, which must include AICs as “artificial agents who are able to care,” is an additional question that may emerge from such an investigation.

2.2 AI companion: the subject you call is (still) not available!

The AIC is in a literal sense a cultural machine, i.e., it is itself a signature of productive and recombining cultural dynamics and is part of the complex social dynamics of digitality that have led to a singularization of the human subject (90). The AIC as technical other—unlike a coffeemaker or vacuum cleaner—apparently triggers our affective involvement. This has been evidenced, for instance, in the case of chatbot use (91–94). It apparently matters most how we perceive our digital companions: it is we who imbue our relations to AICs with some sort of meaning. The fact that we have already started to treat x-bots as if they were human might lead to significant changes in our perception and practice of (adequate) social recognition, particularly as it relates to companionship. Given the different functional roles an AIC can play in a person’s life, we might better understand why these objects mean so much to “their” humans: One person marries the hologram Hatsune Miku (95)6, others share physical intimacy with love dolls (such as Harmony7), some bury their AI pets, such as Paro8, or their love dolls in a proper farewell ceremony9, others do their workout with Pepper10, use Winky11 for entertainment and pre-school education of children, or consult Replika12 for a romantic conversation. A lot of people find something (exclusively) in x-bots that is evaluated as having some sort of additional value or beneficial effect for their lives, sometimes even because they can share with the x-bot something they would never address in relation to people. AICs may even be experienced as preferable over humans with respect to performance qualities, and it might be especially the artificiality—the “as-if”—that makes AICs attractive to humans.

So, why not treat AICs as proper “(re-)productive sources” of social recognition? This still appears puzzling if we spare a second to remind ourselves how the sociability of humans fundamentally differs from “companionship” with a digital device: It seems that AICs lack basically everything that is substantial to social recognition as it has been introduced here. The asymmetry that impregnates the human–x-bot relation seems striking: (1) While human relatedness is characterized by the intersubjective dynamics of mutual social (re-)cognition, even the most “sociable” AI companion designed to date lacks intersubjectivity (albeit it normally is capable of some kind of inter-action and might even be ascribed self-referentiality). There is simply no “subject” that would then be capable of a vital “inter-” beyond the mere “-activity” in the sense of a mutual recognition relation. It therefore still seems reasonable not to ascribe consciousness [cf. (96)] or an intentional self-relation [cf. (97)] to robots. This alone seems the knock-out criterion for perceiving AI cognition in any form as equal, or adequate, for substituting human social (re)cognition practice. Nota bene: with this it is not said that they cannot minimally contribute to social recognition relations (which I believe is possible in some cases and is sketched in what follows). Quite to the contrary, others speak of AI “consciousness” in much more than a metaphorical way and draw strong analogies between human consciousness and, for instance, the algorithmic activities of social AI (98–100). (2) Human praxis is the inter-affective praxis of sentient beings, which is dialogic in nature and encompasses all forms of (e.g., symbolic, physical, etc.) exchange, while an AIC simply traces and tracks emotional expressions, and/or mimics or triggers our emotions, without having any sentient capacity. However impressive the recent developments of affective computing (101) and responsiveness (102, 103) might be, a robot capable of caring13 cannot be assumed (yet), even if these devices actually “do” care in a technical sense (104) and are therefore of instrumental value [cf. (105)] in particular fields of application [for questions of robot liability in care practice, see Beck et al. (106)]. (3) AICs are (inter)active machines, but not organically (vital) social entities. Although AICs, as forms of embodied cognition, possess “striving” and even spontaneity—due to the respective functional design of a binary code—there is no such thing as a conatus or freedom involved (which, when taken from one, elicits an essential suffering). (4) Some would even say that AICs have neither autonomy nor any sort of agency that would come near to that of self-reflexive beings. An AIC is “autonomous” insofar as it runs its program as long as it has an engine, while persons need more than energy, namely liberty, for their autonomous self-actualization. This is the reason that social AI can consequently be considered a-moral, even if its design includes some kind of compatibility with ethical standards and implies rule following, according to which some conceive of AICs as moral agents [cf. (107)]. Capurro (108) reminds us that it is a dilution of the concept of morality to assume that, just because any agent can cause some good or its opposite, this necessarily implies some sort of moral accountability. Indeed, normally (if not otherwise intended by design) it is guaranteed that certain harm-norms are not transgressed by the AIC itself.
Exceptions to this rule might be x-bots that could also be instrumentalized for harming (109, 110), or specific malware, which, for instance, can turn the chatbot Alexa into a “Malexa” (111). And finally, it can be doubted that mere pattern recognition, in a technical sense, is equivalent to the kind of ethical dimension that impregnates the notion of social recognition that I have stressed here.

Nota bene: There are many ways of conceptually “dragging the soul into the machine” by mere definition. Normally this is done by using criteria or specific readings of “capabilities” that then become re-conceptualized as structural relata to human “capacities” or to a criterial definition of personhood, in order to humanize AICs in general, or to let them appear as somehow sufficiently capable of social recognition in particular, because of a “match” or some strong conceptual analogies. I think this debate is basically (still) a matter of “belief,” as robot consciousness cannot (yet, to my knowledge) be evidenced. I therefore suggest adopting a pragmatic view and also counting the pro-arguments for AICs as potentially contributing to the experience of social recognition: It could simply be accepted that human–object relations include libidinous investments and emotional attachment (112) to non-vital objects, too. The responses of AICs (think of highly advanced social x-bots, such as Sophia14) could probably be subjectively experienced and judged as “adequate,” particularly when the device contributes to achieving a particular human good; for instance, when it somehow beneficially offers someone attention or helps to prevent harm. If people can experience themselves as adequately recognized (i.e., loved, respected, truly seen, supported, desired, etc.) by a robot, or trust it (113, 114), we might have reason to consider a further extension of the sphere of recognition relations that somehow integrates this influence of AICs in loneliness treatment and prevention. We might not be dealing with an inter-affective mode of relatedness, but the possibilities that x-bots offer, in principle, simply cannot be ignored regarding loneliness management in the future. To give an example: The pathogenic effects of social isolation in combination with a genetic vulnerability can trigger the onset of schizophrenia and paranoia, as has been exemplarily described in the case of contact-deprivation paranoia (115), especially in older people. So even the possibility of having a conversation or feeling that “some-thing” is around might allow chronic loners to train their communicative skills, which could be helpful in preventing the onset or further development of the pathodynamics of loneliness. Such minimal contributions of AICs could be valued; however, AI companionship that cannot be used as a medium to connect to other fellow beings is—in my opinion—still monologic in nature: we are actually (still) talking to ourselves! I have elsewhere called this the echo chamber scenario of AI companionship (24), which reveals the basic dilemma: there might be some sort of beneficial effect (the surface phenomenon of loneliness might be “eased”), but this remains potentially problematic, as the very basic condition that should ideally be altered is simply reproduced: we are connected, but alone! And even if certain criteria can be ascribed to AICs in analogy to human capacities, I would (to date) still rule out intersubjective capacities in AICs, which I see—particularly with respect to the importance of inter-affective resonance and self-reflexivity for humans—as the basis for true recognition relations. I believe this is exactly what lonely people need to break free from their loneliness, but I would also accept that there are some people who perceive their exchange with an AIC as so fulfilling that they no longer feel alone.
Another point is whether to assess this as a case of proper social integration, which I do not believe is necessarily given, even in scenarios in which someone is just fine with an AIC. It seems quite ironic to try to “fix” the suffering from an altered relatedness to fellow-beings with a tool that in the best case merely appears human. The AIC–human relatedness is Janus-faced: one can certainly emphasize the instrumental value of social AI, but one still must problematize the potentially detrimental effects of AICs, at least if the interventionist ideal is that people really “break free from loneliness.” But what does this mean, and are there probable social AI scenarios that might be preferable to the type of (one-to-one) human–x-bot relation stressed so far?

2.3 How to leave loneliness?

We are intrigued by the illusion of social recognition as the AIC (un)cannily “simulates” affective attunement and understanding, and is often perceived as non-judgmental and accepting, which might be a balm to the wounded soul of people who feel neglected, misjudged, unappreciated by others, etc. This exact “comfort” might be a problem of AIC–human relationships, as on the surface some individual symptoms are “eased,” but people are still “objectively” lonely. By contrast, one might see the opportunity to enhance one’s situation in relation to the AIC even if we are dealing with a “fake” recognition scenario. Coping with one’s loneliness all alone might even be seen as an authentic expression of (digital) autonomy, as this could demonstrate that someone is “in control” of their loneliness. From a psychological perspective, however, we must then be carefully reminded that the autonomous enaction of individuals is always intertwined with the material reality of intersubjective practice, and that this is the decisive realm of agent autonomy. So, it can be assumed that as long as the psychic reality of a person stays connected to the material reality and their horizon of reasoning with other people, being attached to an AIC does not impair agency, although there is always a potential to “fall for the machine,” inasmuch as AICs can be very appealing in showing “unconditional” appreciation and uncritical affirmation, which are constantly available to the user. Some transformative possibilities that may come with loneliness as an existentially challenging experience might be restricted for those who have an especially strong attachment to their beloved object (116) and may relapse into a “forgetfulness of social recognition,” i.e., of how important relations to fellow-beings are, even if these include unpleasant experiences.

AI solutions come with different mediating and transformative potentials that enable loners to get in contact with real people. I have claimed elsewhere (24) that x-bots should in the future be designed to serve this transitional function, i.e., helping to facilitate the transition from a position of social exclusion toward an active positioning of oneself (as a loner) in relation to others. Similarly to how one wears a cast for a few weeks to heal a broken bone, the use of an AIC could be restricted to prevent getting trapped within digital loneliness, i.e., a “fake” recognition scenario. A good example of AI as a medium to “leave loneliness behind” is the location-based augmented reality game Pokémon Go, which provides a digital community with real road maps to go outside and hunt Pokémon. It has been shown that even chronic loners who had drastically withdrawn socially, sometimes for several years, have become motivated to leave their digital loneliness, go outside, and play with others [cf. (117, 118)]. Virtual reality spaces can motivate people to seek out relatedness with others, and effective coping with loneliness seems to be much more likely when the opportunity to meet fellow-beings is given. This has been demonstrated in an Avatar Mediated Conversation setting (119), which users have evaluated as supportive in thematizing sensitive topics, such as their own loneliness experiences. These examples might not be the ultimate guarantee that one finally overcomes loneliness, but they offer far more possibilities for this to happen than being alone with an AIC. Of course, this depends again on the quality of the digital relationships; hence, it is crucial that symptomatic patterns of distorted recognition are not simply repeated in the virtual space. This will now be sketched to finally stress the critical potential of digital loneliness:

3 The critical potential of loneliness—the new precariat

The idea of loneliness as a sociopathological condition is compatible with the sociological and culture-analytic views on loneliness as mirroring a “defect in social relations” [cf. (120)] that cannot be combated with mere digital connectedness (121), and which retrospectively has impregnated the idea of loneliness as an “inner homelessness” (122) or “mental isolation,” for which a “lack of individual ties and interpersonal shared values,” in particular, is essential (123). An active withdrawal from society to voluntarily isolate—or better: to distinguish oneself from others [as is essential, e.g., for Friedrich Nietzsche’s concept of loneliness; cf. Nietzsche ZA 1883–5, III (124)]—would allow one to perceive loneliness as a “heroic” mode of self-appropriation. However, these are very exceptional cases of assessing loneliness as a “valuable” form of social distancing.

Loners often must deal with not receiving even the most fundamental forms of social recognition, and therefore often also withdraw into a kind of digital parallel universe (the “echo chamber” of AI): This is, for instance, the case with the so-called hikikomori (jap. 引きこもり) in Japan. The psychiatrist Tamaki (125), who coined the term, refers to this mode of loneliness as an abnormal avoidance of social contact; the term literally translates as “being confined.” Initial findings before the year 2000 showed that mostly young people were affected, while recently there has been a marked increase in hikikomori among middle-aged and elderly people living in extreme levels of isolation, staying in the same room for a period of at least 6 months and refusing to leave the house. Digital contact with the outer world is maintained, but one’s overall living situation is perceived as unsatisfying and depressing (126), in extreme examples resulting in the phenomenon of kodoku-shi (jap. 孤独死) (118), i.e., solitary death. Apparently, the pressure that comes with societally upheld ideals, which are already impregnated by “system imperatives” (e.g., ideas of constant growth, development, and what counts as success in life, etc.), can lead people to completely withdraw from society and to become loners. Considering the classic alienation paradigm of Critical Theory, the digitalization of social companionship is then a systemically induced change of the lifeworld, in which system imperatives (“the digital agenda”) lead to an inner colonialization (127) of the sphere of the lifeworld, thereby potentially destabilizing important dynamics of social integration processes; mostly when economic imperatives infiltrate all areas of the public sphere, which then shows in the social miseries that are based on acceleration, reification, and commodification, and therefore can be counted as expressions of alienation [see also Fraser et al. (128)]. Certain success goals whose significance and attraction are symbolically universalized within a cultural we-group may be prone to particular (misleading) readings of ideals, which has been exemplarily described by Ehrenberg (129) as a cause of chronically “exhausted societies” in which individuals suffer from depression and burnout. The same dynamics could also be the reason for social isolation and the increase of loneliness. If certain ideals that predetermine the success goals one “has to reach” [think, e.g., of “perfectionism” (kodawari; jap. こだわり) or continual improvement (kaizen; jap. 改善), which are highly ranked norms not only in Japan] are perceived by a growing number of (younger) people as something that cannot be obtained or achieved, this most likely fosters tendencies of social disintegration, exclusion, and stigmatization, which can lead straight to loneliness. Merton (130) specified such criteria for success as causal factors for anomic tendencies in societies. He particularly highlights the role of societal narratives, which often do not sufficiently communicate the fact that certain success goals and the means to achieve them simply cannot be acquired and reached by everyone, given the systemically induced inequality structures and limited resources. While some people actively withdraw from the pressure of success goals, others become chronically lonely owing to different recognition struggles that come with age or with physical and mental impairments.
What unites these groups is the struggle to be recognized as authorized recipients of certain forms of (societal) support (by family, peers, certain organizations, or governmental agencies). In addition to the monetary costs that loneliness causes (e.g., for the psychosocial healthcare sector), there is probably a much higher debt to pay with respect to the role that loneliness plays as a symptom of restricted social participation: the dynamics of collective re-politicization, for instance in the form of radicalization, must also be addressed alongside the notion of digital loneliness. This “discontent” with the analog world and/or within certain digital cultures—the critical potential of loneliness—is revealed in aggression. An example is the explicit misogyny in the incel community, in which a large majority of people (predominantly identified as male) are lonely and thematize their experience of a lack of desired relatedness (131). This is just one among many other forms of (cultural) psychological reaction formations that can foster the social dynamics of ostracizing others, fanaticism, pathological (group) hate, and violence, for which “digital loneliness” is the point of departure. This is the dark side of loneliness: the envy, jealousy, hatred, or revenge fantasies, which Freud (132) would have grouped as aggressive relapses under the diagnostic category of culture hostility (Germ. Kulturfeindseligkeit). They are directed by “unrecognized loners” particularly toward those groups or individuals that supposedly receive all the precious recognition—and the associated resources—that one would like to have for oneself. This is the negative counterpart to a silent (digital) suffering from loneliness and poses another important ethical problem—besides the risk that loners might get trapped in fake recognition scenarios with social AI—that must be critically assessed by addressing distorted recognition as an institutionalized practice of intersubjective exchange in certain (digital) niches.

It can be concluded that loneliness inevitably becomes socially visible, particularly in the digital realms in which it can “flourish”—be it in terms of resentment or in terms of depression and hopelessness about the very condition of feeling and being socially disadvantaged (133). If loneliness leads to anomic states that are intersubjectively and consensually evaluable as harmful or distressing not only for those directly affected but also for others, societies with growing rates of loneliness should be concerned about whether social AI, particularly AICs, really are sufficient tools to deal with these socially precarious dynamics. Despite the enthusiasm about the future possibilities that will come with further humanizing AI, a critical (not pessimistic) view on AI companionship must accompany it if there is something correct about my claim that social AI plays a causally explanatory role in the growing problem of loneliness and its specific, non-trivially harmful forms in recent societies. The idea that digital connectedness automatically positively alters precarious situations of loneliness is dismantled as a myth. In the future, the x-bot, the AIC—the artifact that is treated as if it were human—will continue to uncannily intrude into our lifeworld, in which loneliness counts as the new precariat. This points to the risks of the AI agenda: The very aim of trying to solve loneliness with AI cognition appears misled with respect to the duties that come with the ethics of social recognition.

4 Conclusion

The aim of this analysis was to provide some reason to accept that the lifeworldly logics of recognition cannot easily be substituted by AI cognition. I have adopted a culture-reflexive, critical view on digital loneliness (management). If loneliness is basically a (collective) cry for the need for social recognition, it seems that (to date) AI cognition cannot replace, substitute, or compensate for human recognition in such a way that would make it ethically justifiable to further outsource the (re-)production conditions of social integration to AI. Loneliness refers to the state in which one is connected to others or related to an AIC, but still is deprived of meaningful (i.e., of a specific qualitative) relatedness in terms of empathy, mutual respect, and understanding. In those digital scenarios where this is actually experienced with an x-bot, one can pragmatically accept that there are cases in which AICs might offer social recognition to people, although it has simultaneously been stressed that this might be the point of departure for the manifestation of chronic loneliness, which must be further determined by empirical research.

Understanding loneliness requires us to ethically tackle both the risk of becoming even lonelier in a monologue staged by a machine that is designed to let us forget about its echo chamber nature, and the risk of the reproduction of loneliness in being exposed to “active forgetfulness” by our fellow beings. To ensure that it does not become a cultural trap, but a possible cure, what I suggest predominantly matters is not how advanced the x-bots are, but how advanced a society is in tracking and preventing a lack of or false recognition as structural deficits that cause social maladies such as loneliness. This critique of established social narratives on “social” AI points out that well-digitalized societies should invest in humane institutions in the first place to create adequate ethical embedding conditions for any further institutionalization of humanoids.

Author contributions

KJ: Writing – original draft.

Funding

The author declares that no financial support was received for the research, authorship, and/or publication of this article.

Acknowledgments

I would like to express my gratitude to the reviewers, the editors, and to Nicholas Parnell Pimlott for proofreading.

Conflict of interest

The author declares that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Footnotes

1. The term “social x-bots” can be used to refer to all types of digital “companions” (hereafter also referred to as AICs) that can be integrated into or are designed to adapt to human relational dynamics (e.g., specific communicative settings) and show differences in terms of the specificity of embodiment, interface capabilities, and system coupling. I am aware that there are plenty of different types of robots, such as Winky, Aibo, MiRo, Paro, EmotiRob, Pepper, Dinsow, ElliQ, Atlas, Asimo, Harmony, LOVOT, and conversational agents or chatbots such as ELIZA, Alexa, XiaoIce, Replika, Tess, Woebot, and Wysa, etc. Relevant for this analysis is that all these x-bots create illusions of intersubjective/inter-affective exchange, but might still not be seen as adequate to solve the problem of a lack of meaningful relatedness in loneliness.

2. Japan’s annual number of births fell below 1 million in 2016 for the first time since 1899, leading to a boom of social AI in the form of care-assisting robots for dealing with the high degree of gerontification. According to data from the Ministry of Internal Affairs and Communications, the number of people over 65 has risen to 36.4 million. Japan’s Ministry of Economy, Trade and Industry forecasts that the robot service industry will grow to nearly $4 billion annually by 2035, which is 25 times its current value (stat.go.jp, September 19, 2021; accessed July 28, 2023).

3. Non-trivial refers to suffering from loneliness in a (socially) pathologically relevant sense, as distinct from other forms of suffering from it that do not imply the biopsychosocial malfunctioning associated with loneliness as a persistent (“chronic”) condition.

4. The Demographic Change and Healthy Ageing Unit of the WHO has announced the UN Decade of Healthy Ageing (2021–2030) and addresses social isolation and loneliness as one of the most pressing topics of health promotion, particularly for the elderly population, also under the auspices of digital interventions (such as skills training, community and support groups, and cognitive behavioral therapy) that have been developed to reduce social isolation among older people. The aim of improving access to information and communication technologies in order to create a more age-friendly community comes with an interventionist claim that seems to have been overlooked for a long time.

5. Language […] has created the word “loneliness” to express the pain of being alone. [A]nd it has created the word “solitude” to express the glory of being alone (Tillich 1963, p. 17).

6. The Japanese mangaka and illustrator Kei Garō designed the virtual character Hatsune Miku on behalf of Crypton Future Media, and the company Gatebox uses this character for their companion chatbot. Akihiko Kondo officially married the virtual figure Hatsune Miku. See https://www.otaquest.com/hatsune-miku-gatebox-marriage/ (accessed November 25, 2022).

7. Harmony is a RealDoll companion robot that allows customers to create their doll by choosing among 10 “persona points” to customize it according to individual preferences. https://www.althumans.com/companion-robots/real-doll.html (accessed August 17, 2023).

8. Paro is an AI pet that resembles a Canadian harp seal pup. It has been in use in nursing facilities in Japan since 2003. It responds to tactile stimuli and recognizes temperature, posture, and light.

9. Leiya Arata is a founder of the Love Doll Funeral services in Osaka. For $800, customers can have a mannequin memorial. Presiding over the mannequin memorials is the Buddhist monk Lay Kato. There are also farewell ceremonies for AI pets: When Sony announced in 2014 that they would no longer support updates for AIBO, the community of owners began sharing tips on providing care for their digital friends in the absence of official support.

10. Pepper is called a “semi-humanoid” robot manufactured by SoftBank Robotics (formerly Aldebaran Robotics). Its crucial feature is emotion detection based on face and voice tone analysis. Pepper simulates active listening with arm and hand gestures, which creates the appearance of having self-awareness. It was introduced in June 2014 in Japan.

11. Winky is a play-bot produced by the company Mainbot and comes equipped with a microphone, sensors, a speaker, LEDs, a rotating head and ears, a motion and distance detector, and a gyroscope for interacting with the environment.

12. Replika is a generative AI chatbot that was released to the public in 2017 and within 1 year was used by 2 million people. The user must answer a series of questions to create a network that serves as a contextual frame for the “friendship”—including romantic and, before the function was disabled by the developers in 2023, erotic relationships.

13. “Caring” is not a morally neutral concept but is bound to ideas about the good life. To count as caring, the design of x-bots would have to include a reflexive self-relation that manifests itself in acts of caring as moral self-realization. Even if an AIC is responding to its object of care, there is no “knowing,” as it has no idea of its own wellbeing or the wellbeing of others. But it is exactly this that qualifies caring as the mode of being a moral person.

14. Hanson Robotics has developed Sophia. In 2017, it was given citizenship by Saudi Arabia and was the first non-human to be given a United Nations title (the UN Development Programme’s first Innovation Champion).

Publisher's note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

References

1. Honneth A. Pathologies of the social: the past and present of social philosophy. In: Rasmussen DM, editor. Handbook of Critical Theory. Cambridge, MA: Blackwell (1996). p. 369–98.

2. Jacobs KA, Kettner M. Zur Theorie “sozialer Pathologien” bei Freud, Fromm, Habermas und Honneth. In: Clemenz M, Zitko H, Büchsel M, Pflichthofer D, editors. Interdisziplinäres Jahrbuch für Psychoanalyse und Ästhetik, Band 4. Gießen: IMAGO (2017). p. 119–46.

3. Drageset J. The importance of activities of daily living and social contact for loneliness: a survey among residents in nursing homes. Nordic college of caring sciences. Scand J Caring Sci. (2004) 18(1):65–7. doi: 10.1111/j.0283-9318.2003.00251.x

4. Adler NE, Boyce T, Chesney MA, Cohen S, Folkman S, Kahn RL, et al. Socioeconomic status and health. The challenge of the gradient. Am Psychol. (1994) 49(1):15–24. doi: 10.1037//0003-066x.49.1.15

5. Orth-Gomer K, Rosengren A, Wilhelmsen L. Lack of social support and incidence of coronary heart disease in middle-aged Swedish men. Psychosom Med. (1993) 55(1):37–43. doi: 10.1097/00006842-199301000-00007

6. Thurston RC, Kubzansky LD. Women, loneliness, and incident coronary heart disease. Psychosom Med. (2009) 71(8):836–42. doi: 10.1097/PSY.0b013e3181b40efc

7. Wilson RS, Krueger KR, Arnold SE, Schneider JA, Kelly JF, Barnes LL, et al. Loneliness and risk of Alzheimer disease. Arch Gen Psychiatry. (2007) 64(2):234–40. doi: 10.1001/archpsyc.64.2.234

8. Cacioppo JT, Ernst JM, Burleson MH, McClintock MK, Malarkey WB, Hawkley LC, et al. Lonely traits and concomitant physiological processes: the MacArthur social neuroscience studies. Int J Psychophysiol. (2000) 35(2–3):143–54. doi: 10.1016/s0167-8760(99)00049-5

9. Vanhalst J, Klimstra TA, Luyckx K, Scholte RHJ, Engels RCME, Goossens L. The interplay of loneliness and depressive symptoms across adolescence: exploring the role of personality traits. J Youth Adolesc. (2012) 41(6):776–87. doi: 10.1007/s10964-011-9726-7

10. Cacioppo JT, Hughes ME, Waite LJ, Hawkley LC, Thisted RA. Loneliness as a specific risk factor for depressive symptoms: cross-sectional and longitudinal analyses. Psychol Aging. (2006) 21(1):140–51. doi: 10.1037/0882-7974.21.1.140

11. Janzarik W. Über das Kontaktmangelparanoid des höheren Alters und den Syndromcharakter schizophrenen Krankseins. Der Nervenarzt. (1973) 44(10):515–26. PMID: 4765908

12. Jacobs KA. “Loneliness and the Critical Theory of Big Data”; Network Dialogues 5 Philosophy of the Digital “Connected and Yet Alone?” Literature Forum at the Brecht-Haus, Berlin, 20.06.2019, supported by Philosophie Magazin Deutschland (2019). Available online at: https://lfbrecht.de/event/vernetzt-und-doch-allein/ (accessed August 19, 2019).

13. Buecker S, Maes M, Denissen JJA, Luhmann M. Loneliness and the big five personality traits: a meta-analysis. Eur J Pers. (2020) 34:8–28. doi: 10.1002/per.2229

14. Amichai-Hamburger Y, Ben-Artzi E. Loneliness and internet use. Comput Human Behav. (2003) 19(1):71–80. doi: 10.1016/S0747-5632(02)00014-6

15. Cassidy J, Asher SR. Loneliness and peer relations in young children. Child Dev. (1992) 63(2):350–65. doi: 10.2307/1131484

16. Lalayants M, Price JD. Loneliness and depression or depression-related factors among child welfare-involved adolescent females. Child Adolesc Soc Work J. (2015) 32(2):167–76. doi: 10.1007/s10560-014-0344-6

17. Asher SR, Parkhurst JT, Hymel S, Williams GA. Loneliness and peer relations in childhood. In: Asher SR, Coie JD, editors. Peer Rejection in Childhood. Cambridge: Cambridge University Press (1990). p. 253–73.

18. Thomas S. Einsamkeitserfahrungen junger Menschen—nicht nur in Zeiten der Pandemie. Soz Passagen. (2022) 14:97–112. doi: 10.1007/s12592-022-00415-7

19. Findlay RA. Interventions to reduce social isolation amongst older people: where is the evidence? Ageing Soc. (2003) 23(5):647–58. doi: 10.1017/S0144686X03001296

20. Srinivasan S, O'Fallon LR, Dearry A. Creating healthy communities, healthy homes, healthy people: initiating a research agenda on the built environment and public health. Am J Public Health. (2003) 93(9):1446–50. doi: 10.2105/ajph.93.9.1446

21. World Health Organization (WHO). World report on ageing and health (2000). Available online at: https://www.who.int/teams/social-determinants-of-health/demographic-change-and-healthy-ageing/social-isolation-and-loneliness (accessed November 28, 2022).

22. Mann F, Wang J, Pearce E, Ma R, Schlief M, Lloyd-Evans B, et al. Loneliness and the onset of new mental health problems in the general population. Soc Psychiatry Psychiatr Epidemiol. (2022) 57:2161–78. doi: 10.1007/s00127-022-02261-7

23. Jacobs KA. The depressive situation. Front Psychol. (2013) 4:429. doi: 10.3389/fpsyg.2013.00429

24. Jacobs KA. (Nothing) human is alien—AI companionship and loneliness. In: Possati LM, editor. Humanizing Artificial Intelligence. Berlin: Walter de Gruyter (2023). p. 51–70. doi: 10.1515/9783111007564-0

25. Jacobs KA. Loneliness—a change of meaningful relatedness and social recognition. In: Lecture for the Center for Human Nature, Artificial Intelligence, and Neuroscience (CHAIN). Sapporo: Hokkaido University (2023). p. 1–20.

26. Tillich P. The Eternal Now. New York: Scribner’s (1963). Available online at: https://www.semanticscholar.org/paper/The-Eternal-Now-Tillich/0cd62af3915fbde491db06f60a41013566202c49 (accessed February 6, 2024).

27. Satorius M. Die hohe Schule der Einsamkeit. Von der Kunst des Alleinseins. Gütersloh: Gütersloher Verlagshaus (2006).

28. Poschardt U. Einsamkeit. Die Entdeckung eines Lebensgefühls. München: Piper Verlag (2007).

29. Illouz E. Warum Liebe weh tut. Eine soziologische Erklärung. Frankfurt am Main: Suhrkamp (2016).

30. Rosa H. Beschleunigung als Entfremdung. Berlin: Suhrkamp (2019). [Engl.: Rosa H. Alienation and Acceleration. Towards a Critical Theory of Late-Modern Temporality. NSU Press (2010)]. doi: 10.1080/2156857X.2015.1047596

31. Kucklick C. Die granulare Gesellschaft. Wie das Digitale unsere Wirklichkeit auflöst. München: Ullstein (2016).

32. Turkle S, Taggart W, Kidd CD, Dasté O. Relational artifacts with children and elders: the complexities of cybercompanionship. Conn Sci. (2006) 18(4):347–61. doi: 10.1080/09540090600868912

33. Weiss RS. Reflections on the present state of loneliness research. J Soc Behav Pers. (1987) 2:1–16.

34. Jacobs KA. Loneliness from an interdisciplinary perspective. What does it mean to be human? In: Lecture for the Center for Human Nature, Artificial Intelligence, and Neuroscience (CHAIN). Sapporo: Hokkaido University (2022).

35. Murphy PM, Kupshik GA. Loneliness, Stress and Well-Being. A Helper's Guide. London: Tavistock/Routledge (1992). doi: 10.4324/9780203136263

36. Jacobs KA. Reframing loneliness with critical theory. In: International Conference: Disordered Worlds – Self, Person, and the Social in Philosophy of Psychiatry; 27 April 2017, Stockholm, Sweden. Stockholm: Södertörn University (2017).

37. Jacobs KA. Loneliness from psychological, philosophical, and spiritual perspective. In: 10th Summer Academy for Integrative Medicine; 2019 August 9–18. Witten: University of Witten/Herdecke (2019).

38. Jacobs KA. In der Welt allein mit anderen sein – philosophische Perspektiven auf Einsamkeit (Transl.: Being in the world alone with others – philosophical perspectives on loneliness). In: 28th Hofgeismarer Psychiatrie-Tagung; 2020 Feb 19. Hofgeismar: Evang. Tagungsstätte Hofgeismar (2019). Available online at: https://cloud.akademie-hofgeismar.de/2019/115945.pdf (accessed January 26, 2024).

39. Jacobs KA. The phenomenality and psychopathology of loneliness: psychopathology and clinical phenomenology of loneliness. Presented at: Loneliness makes sick, and sickness makes lonely; 2019 Mar 18. Bad Emstal: Vitos Clinic Bad Emstal (2020).

40. Zimmermann JG. Über die Einsamkeit, Vol. 2. Troppau: Bey Christian Gottlieb Schmieder (1786).

41. Maduschka L. Das Problem der Einsamkeit im 18. Jahrhundert, insbesondere bei J. G. Zimmermann. Forschungen zur neueren Literaturgeschichte, Bd. LXVI, begründet von Franz Muncker. Weimar: Alexander Duncker (1932).

42. Peplau LA, Caldwell MA. Loneliness: a cognitive analysis. Essence. (1978) 2(4):207–20.

43. Wood MM. Paths of Loneliness: The Individual Isolated in Modern Society. Chichester, West Sussex: Columbia University Press (1953). doi: 10.7312/wood92248

44. Leigh-Hunt N, Bagguley D, Bash K, Turner V, Turnbull S, Valtorta N, et al. An overview of systematic reviews on the public health consequences of social isolation and loneliness. Public Health. (2017) 152:157–71. doi: 10.1016/j.puhe.2017.07.035

45. von Oppen D. Einsamkeit als Last und Bedürfnis. In: Bittner W, editor. Einsamkeit in medizinisch-psychologischer, theologischer und soziologischer Sicht. Stuttgart: Klett-Verlag (1967). p. 104–10.

46. Brinkmann D. Der einsame Mensch und die Einsamkeit. Ein Beitrag zur Psychologie des Kontakts. Psychologische Rundschau. (1951) 3:21–30. doi: 10.1007/BF03374675

47. Jacobs KA. Roboter gegen Einsamkeit? Zur Reproduktionsdynamik falscher und mangelnder Anerkennung durch “soziale” KI, xx-xx. In: Adolphi R, Alpsancar S, Hahn S, Kettner M, editors. Philosophische Digitalisierungsforschung. Zum Wandel von Verständigung, Verantwortung, Vernunft und Macht. Darmstadt: Wissenschaftliche Buchgesellschaft (2024).

48. Honneth A. Die Krankheiten der Gesellschaft. Annäherung an einen nahezu unmöglichen Begriff. Westend. (2014) 1:45–60.

49. Honneth A. Unsichtbarkeit. Stationen einer Theorie der Intersubjektivität. Frankfurt am Main: Suhrkamp (2003).

50. Honneth A. Reification and recognition. In: Jay M, editor. Reification: A New Look at an Old Idea. Oxford: Oxford University Press (2018). p. 17–94, esp. p. 40–52. Originally delivered as the Tanner Lectures on Human Values (Berkeley, CA), Spring 2005.

51. Peplau LA, Perlman D. Perspectives on loneliness. In: Peplau LA, Perlman D, editors. Loneliness: A Sourcebook of Current Theory, Research, and Therapy. New York: Wiley (1982). p. 1–18. doi: 10.2307/2068915

52. Sticker M. Poverty, exploitation, mere things and mere means. Ethic Theory Moral Prac. (2023) 26:191–207. doi: 10.1007/s10677-021-10238-9

53. Fuchs T, De Jaegher H. Enactive intersubjectivity. Participatory sense-making and mutual incorporation. Phenom Cogn Sci. (2009) 8:465–86. doi: 10.1007/s11097-009-9136-4

54. Schilbach L, Timmermans B, Reddy V, Costall A, Bente G, Schlicht T, et al. Toward a second-person neuroscience. Behav Brain Sci. (2013) 36(4):393–414. doi: 10.1017/S0140525X12000660

55. Amodio DM, Frith CD. Meeting of minds: the medial frontal cortex and social cognition. Nat Rev Neurosci. (2006) 7(4):268–77. doi: 10.1038/nrn1884

56. Gallotti ML, Frith CD. Social cognition in the we-mode. Trends Cogn Sci (Regul Ed). (2013) 17(4):1–6. doi: 10.1016/j.tics.2013.02.002

57. Heidegger M. Being and Time. Translated by John Macquarrie and Edward Robinson. San Francisco: Harper & Row (1962). Available online at: https://www.researchgate.net/publication/335060851_Heidegger_M_1962_Being_and_Time7 (accessed February 24, 2024).

58. Mijuskovic BL. Loneliness in Philosophy, Psychology, and Literature. Bloomington: iUniverse (2012). Available online at: https://www.iuniverse.com/en/bookstore/bookdetails/357629-Loneliness-in-Philosophy-Psychology-and-Literature (accessed February 24, 2024).

59. Jacobs KA. Loneliness in philosophy, psychology, and literature by Ben Lazare Mijuskovic, iUniverse 2012. Metapsychology Online. (2013) 17(39). Available online at: http://metapsychology.mentalhelp.net/poc/view_doc.php?type=book&id=6978 (accessed February 24, 2024).

60. Honneth A. Verdinglichung. Eine anerkennungstheoretische Studie. Um Kommentare von Judith Butler, Raymond Geuss und Jonathan Lear erweiterte Ausgabe. Frankfurt am Main: Suhrkamp (2015).

61. Fuchs T. The brain—a mediating organ. J Conscious Stud. (2011) 18(7–8):196–221.

62. Papoušek H, Papoušek M. Beyond emotional bonding: the role of preverbal communication in mental growth and health. Infant Ment Health J. (1992) 13:43–53. doi: 10.1002/1097-0355(199221)13:1%3C43::AID-IMHJ2280130108%3E3.0.CO;2-R

63. Colombetti G. The Feeling Body. Affective Science Meets the Enactive Mind. Cambridge, MA: MIT (2014). Available online at: https://academic.oup.com/mit-press-scholarship-online/book/13764 (accessed August 18, 2023).

64. Jacobs KA. The concept of narcissistic personality disorder—three levels of analysis for interdisciplinary integration. Front Psychiatry. (2022) 13:989171. doi: 10.3389/fpsyt.2022.989171

65. Winnicott DW. The capacity to be alone. Int J Psychoanal. (1958) 39(5):416–20.

66. Jerusalem M, Lohaus A, Klein-Heßling J. Gesundheitsförderung im Kindes- und Jugendalter. Göttingen: Hogrefe Verlag (2006).

67. Bowlby J. The nature of the child's tie to his mother. Int J Psychoanal. (1958) 39:350–73. PMID: 13610508.

68. Asher SR, Hymel S, Renshaw PD. Loneliness in children. Child Develop. (1984) 55(4):1456–64. doi: 10.2307/1130015

69. Winnicott DW. The Child, the Family, and the Outside World. Harmondsworth: Penguin (1964).

70. Klein M, Riviere J. Seelische Urkonflikte. Liebe, Hass und Schuldgefühl. Frankfurt am Main: Fischer Wissenschaft (1992).

71. Rook KS. Promoting social bonding: strategies for helping the lonely and socially isolated. Am Psychol. (1984) 39(12):1389–407. doi: 10.1037/0003-066X.39.12.1389

72. Eisenberger NI, Lieberman MD, Williams KD. Does rejection hurt? An FMRI study of social exclusion. Science. (2003) 302(5643):290–2. doi: 10.1126/science.1089134

73. Eisenberger NI, Lieberman MD. Why rejection hurts: a common neural alarm system for physical and social pain. Trends Cogn Sci (Regul Ed). (2004) 8(7):294–300. doi: 10.1016/j.tics.2004.05.010

74. Price DD. Psychological and neural mechanisms of the affective dimension of pain. Science. (2000) 288(5472):1769–72. doi: 10.1126/science.288.5472.1769

75. Cacioppo JT, Patrick W. Loneliness: Human Nature and the Need for Social Connection. New York: Norton & Co. (2008).

76. Honneth A. Der Kampf um Anerkennung: Zur moralischen Grammatik sozialer Konflikte. Frankfurt am Main: Suhrkamp (1994) [English translated by J. Anderson as: The Struggle for Recognition: The Moral Grammar of Social Conflicts. Cambridge: Polity Press (2005)].

77. Rotenberg KJ, MacKie K. Stigmatization of social and intimacy loneliness. Psychol Rep. (1999) 84(1):147–8. doi: 10.2466/pr0.1999.84.1.147

78. Bohn C. Die soziale Dimension der Einsamkeit: Unter besonderer Berücksichtigung der Scham. Hamburg: Kovac Verlag (2008).

79. Spitzer M. Einsamkeit – die unerkannte Krankheit: schmerzhaft, ansteckend, tödlich. Droemer eBook (2018).

80. Cacioppo JT, Fowler JH, Christakis NA. Alone in the crowd: the structure and spread of loneliness in a large social network. J Pers Soc Psychol. (2009) 97(6):977–91. doi: 10.1037/a0016076

81. Bellucci G. Positive attitudes and negative expectations in lonely individuals. Sci Rep. (2020) 10(1):18595. doi: 10.1038/s41598-020-75712-3

82. Lucas GM, Knowles ML, Gardner WL, Molden DC, Jefferis VE. Increasing social engagement among lonely individuals: the role of acceptance cues and promotion motivations. Pers Soc Psychol Bull. (2010) 36(10):1346–59. doi: 10.1177/0146167210382662

83. Xie T, Pentina I. Attachment theory as a framework to understand relationships with social chatbots: a case study of Replika. In: Hawaii International Conference on System Sciences 2022; 2022 Jan 4–7. Honolulu, HI: University of Hawai'i at Mānoa Hamilton Library (2022). doi: 10.24251/HICSS.2022.258. Available online at: https://scholarspace.manoa.hawaii.edu/handle/10125/79590 (accessed February 24, 2024).

84. Cañamero L. Emotion understanding from the perspective of autonomous robots research. Neural Netw. (2005) 18:445–55. doi: 10.1016/j.neunet.2005.03.003

85. Soleymani M, Lichtenauer J, Pun T, Pantic M. A multimodal database for affect recognition and implicit tagging. IEEE Trans Affect Comput. (2011) 3:42–55. doi: 10.1109/T-AFFC.2011.25

86. Shao M, Alves SFR, Ismail O, Zhang X, Nejat G, Benhabib B. You are doing great! Only one rep left: an affect-aware social robot for exercising. In: 2019 IEEE International Conference on Systems, Man and Cybernetics (SMC); 2019 Oct 6–9, Bari, Italy. Bari: IEEE Press (2019). p. 3811–7. doi: 10.1109/SMC.2019.8914198

87. Spezialetti M, Placidi G, Rossi S. Emotion recognition for human-robot interaction: recent advances and future perspectives. Front Robot AI. (2020) 7:532279. doi: 10.3389/frobt.2020.532279

88. Friedman C. Human-robot moral relations: human interactants as moral patients of their own agential moral actions towards robots. In: Gerber A, editor. Artificial Intelligence Research. SACAIR 2021. Communications in Computer and Information Science. Vol. 1342. Cham: Springer (2020). doi: 10.1007/978-3-030-66151-9_1

89. Breazeal C, Scassellati B. How to build a robot that makes friends and influences people. In: Proceedings, 1999 IEEE/RSJ International Conference on Intelligent Robots and Systems: human and environment friendly robots with high intelligence and emotional quotients; 1999 Oct 17–21, Kyongju, Korea. Kyongju: IEEE (1999). p. 858–63. doi: 10.1109/IROS.1999.812787

90. Reckwitz A. Die Gesellschaft der Singularitäten. Zum Strukturwandel der Moderne. Berlin: Suhrkamp (2018).

91. Skjuve M, Følstad A, Fostervold KI, Brandtzaeg PB. My chatbot companion—a study of human-chatbot relationships. Int J Hum Comput Stud. (2021) 149:1–14. doi: 10.1016/j.ijhcs.2021.102601

92. Gao J, Galley M, Li L. Neural approaches to conversational AI. In: SIGIR '18: The 41st International ACM SIGIR Conference on Research and Development in Information Retrieval; 2018 July 8–12, Ann Arbor, MI, United States. New York: Association for Computing Machinery (2018). doi: 10.48550/arXiv.1809.08267

93. Purington A, Taft JG, Sannon S, Bazarova NN, Taylor SH. “Alexa is my new BFF”: social roles, user satisfaction, and personification of the amazon echo. In: Proceedings of the 2017 CHI Conference Extended Abstracts on Human Factors in Computing Systems; 2017 May 6–11, Denver, CO, United States. New York: Association for Computing Machinery. (2017). p. 2853–9.

94. Archer MS. Friendship between human beings and AI robots? In: von Braun J, Archer MS, Reichberg GM, Sánchez Sorondo M, editors. Robotics, AI, and Humanity. Cham: Springer (2021). p. 177–89. doi: 10.1007/978-3-030-54173-6_15

95. Johnston L. Japanese Man Marries Hatsune Miku Gatebox Device. (2018) OTAQUEST. Available online at: https://www.otaquest.com/hatsune-miku-gatebox-marriage/ (accessed August 19, 2023).

96. Dreyfus HL. Die Grenzen künstlicher Intelligenz. Was Computer nicht können. Königstein: Athenäum (1985).

97. Böhme G. Ethik leiblicher Existenz. Frankfurt am Main: Suhrkamp (2008).

98. Kurzweil R. KI – Das Zeitalter der Künstlichen Intelligenz. München: Hanser Verlag (1993).

99. Lacan J. Le séminaire, livre II: le moi dans la théorie de Freud et dans la technique de la psychanalyse. Paris: Seuil (1978).

100. Possati LM. The Algorithmic Unconscious. How Psychoanalysis Helps in Understanding AI. London: Routledge (2021). doi: 10.4324/9781003141686

101. Picard R. Affective Computing. Cambridge, MA: MIT Press (1997).

102. Asada M. Development of artificial empathy. Neurosci Res. (2015) 90:41–50. doi: 10.1016/j.neures.2014.12.002

103. McStay A. Emotional AI. The Rise of Empathic Media. London: SAGE (2018).

104. Manzeschke A. Robots in care. On people, machines, and other helpful entities. In: Rubeis G, Hartmann KV, Primc N, editors. Digitalisierung in der Pflege. Göttingen: Vandenhoeck & Ruprecht (2022). p. 201–10. doi: 10.14220/9783737014793.201

105. Beck S. Zum Einsatz von Robotern im Palliativ- und Hospizbereich. MedR. (2018) 36:773–8. doi: 10.1007/s00350-018-5046-1

106. Beck S, Faber M, Gerndt S. Rechtliche Aspekte des Einsatzes von KI und Robotik in Medizin und Pflege. Ethik Med. (2023) 35:247–63. doi: 10.1007/s00481-023-00763-9

107. Floridi L, Sanders JW. On the morality of artificial agents. Minds Mach. (2004) 14(3):349–79. doi: 10.1023/b:mind.0000035461.63578.9d

108. Capurro R. Towards an ontological foundation of information ethics. Ethics Inf Technol. (2006) 8(4):175–86. doi: 10.1007/s10676-006-9108-0

109. Weiss LG. Autonomous robots in the fog of war. IEEE Spectr. (2011) 48(8):30–57. doi: 10.1109/MSPEC.2011.5960163

110. Cartwright J. Rise of the robots and the future of war (2010). Available online at: https://www.theguardian.com/technology/2010/nov/21/military-robots-autonomous-machines (accessed August 19, 2023).

111. Sharevski F, Jachim P, Treebridge P, Li A, Babin A, Adadevoh C. Meet malexa, Alexa’s malicious twin: malware-induced misperception through intelligent voice assistants. Int J Hum Comput Stud. (2021) 149:102604. doi: 10.1016/j.ijhcs.2021.102604

112. Kernberg OF. Object Relations Theory and Clinical Psychoanalysis. Oxford: Jason Aronson Book/Rowman & Littlefield Pub (1976).

113. High-Level Expert Group on Artificial Intelligence. Ethics guidelines for trustworthy AI (2019). Available online at: https://digital-strategy.ec.europa.eu/en/policies/expert-group-ai (accessed August 15, 2023).

114. Schröder I, Müller O, Scholl H, Levy-Tzedek S, Kellmeyer P. Can robots be trustworthy? Ethik Med. (2023) 35:221–46. doi: 10.1007/s00481-023-00760-y

115. Breitner J. Paranoid psychoses in old age, much more common than previously thought? Arch Gen Psychiatry. (2002) 59:60–1. doi: 10.1001/archpsyc.59.1.60

116. Habermas T. Geliebte Objekte: Symbole und Instrumente der Identitätsbildung. Berlin: Walter de Gruyter GmbH & Co KG (2020). Available online at: https://www.suhrkamp.de/buch/tilmann-habermas-geliebte-objekte-t-9783518290149. (accessed February 24, 2024).

117. Kato TA, Teo AR, Tateno M, Watabe M, Kubo H, Kanba S. Can Pokémon GO rescue shut-ins (hikikomori) from their isolated world? Psychiatry Clin Neurosci. (2017) 71(1):75–6. doi: 10.1111/pcn.12481

118. Kato TA, Shinfuku N, Sartorius N, Kanba S. Loneliness and single-person households: issues of Kodoku-Shi and Hikikomori in Japan. In: Okkels N, Kristiansen C, Munk-Jorgensen P, editors. Mental Health and Illness in the City. Mental Health and Illness Worldwide. Singapore: Springer (2017). p. 1–15. doi: 10.1007/978-981-10-0752-1_9

119. Barbosa Neves B, Waycott J, Maddox A. When technologies are not enough: the challenges of digital interventions to address loneliness in later life. Sociol Res Online. (2021) 28(1):150–70. doi: 10.1177/13607804211029298

120. Horney K. The Neurotic Personality of Our Time. New York: W.W. Norton and Company (1937).

121. Schetsche M, Lehmann K. Netzwerker-Perspektiven. Bausteine einer praktischen Soziologie des Internet. Regensburg: S. Roderer (2003).

122. Simmel G. Soziologie. Untersuchungen über die Formen der Vergesellschaftung. Berlin: Duncker & Humblot (1908).

123. Fromm E. Die Furcht vor der Freiheit. New York: Holt, Rinehart & Winston (1941).

124. Nietzsche F. Also sprach Zarathustra. III. In: Colli G, Montinari M, editors. Also sprach Zarathustra I–IV. Kritische Studienausgabe, Bd. 4. Berlin: De Gruyter (2021).

125. Tamaki S. Hikikomori: Adolescence Without End. (J. Angles, Trans.). Minneapolis: University of Minnesota Press (2013).

126. Cacioppo JT, Hawkley L. People thinking about people: the vicious cycle of being a social outcast in one’s own mind. In: Williams KD, Forgas JP, Von Hippel W, editors. The Social Outcast: Ostracism, Social Exclusion, Rejection, and Bullying. New York: Psychology Press (2005). p. 91–108.

127. Habermas J. Theorie des kommunikativen Handelns. Band 2. Zur Kritik der funktionalistischen Vernunft. Frankfurt: Suhrkamp (1981).

128. Deutscher P, Lafont C, editors. Critical Theory in Critical Times: Transforming the Global Political and Economic Order. New York: Columbia University Press (2017). doi: 10.7312/deut18150

129. Ehrenberg A. La Fatigue d'être soi. Dépression et société. Paris: Poches Odile Jacob (1998). p. 778–80.

130. Merton R. Social structure and anomie. Am Sociol Rev. (1938) 3(5):672–82. doi: 10.2307/2084686

131. Speckhard A, Ellenberg M, Morton J, Ash A. Involuntary celibates’ experiences of and grievance over sexual exclusion and the potential threat of violence among those active in an online Incel forum. J Strateg Secur. (2020) 14(2):89–121. doi: 10.5038/1944-0472.14.2.1910

132. Freud S. Das Unbehagen in der Kultur. Wien (1930). G.S. 12, 29; G.W. 14, 421 (248). [Trans.: Civilization and its Discontents. London and New York (1930). In: Strachey J, editor. The Standard Edition of the Complete Psychological Works of Sigmund Freud, Vol. 21. Macmillan (1955)].

133. Schmalenbach H. Die Genealogie der Einsamkeit. In: Kroner R, Mehlis G, editors. Logos. Internationale Zeitschrift für Philosophie der Kultur. Tübingen: Verlag von J.C.B. Mohr (Paul Siebeck) (1919–20). p. VIII.

Keywords: (digital) loneliness, recognition, AI companionship, critical theory, meaningful relations

Citation: Jacobs KA (2024) Digital loneliness—changes of social recognition through AI companions. Front. Digit. Health 6:1281037. doi: 10.3389/fdgth.2024.1281037

Received: 11 December 2023; Accepted: 15 February 2024;
Published: 5 March 2024.

Edited by:

Erik Børve Rasmussen, Oslo Metropolitan University, Norway

Reviewed by:

Gunhild Tøndel, Norwegian University of Science and Technology, Norway
Beenish Chaudhry, University of Louisiana at Lafayette, United States

© 2024 Jacobs. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Kerrin Artemis Jacobs kjacobs@let.hokudai.ac.jp
