PERSPECTIVE article
Front. Comput. Sci.
Sec. Human-Media Interaction
Volume 7 - 2025 | doi: 10.3389/fcomp.2025.1508004
This article is part of the Research Topic: Digital Heritage Futures.
The final, formatted version of the article will be published soon.
This perspective article argues not only that the humanities benefit from and are transformed by recent developments in AI, but also that AI might benefit from the humanities. This is demonstrated with regard to the symbol grounding problem in AI, by taking into account that meaning is not the outcome of a two-way relation between an object and a brain (or an AI), but of the negotiation of meaning in the triadic relation between objects, symbols, and human practices. This view is common in the interpretive social research tradition of the humanities. We argue that AI benefits from embedding generative methods in interpretive social research methodologies. How this can be achieved is demonstrated with the example of the recently developed methodology of interpretive agent-based modelling (iABM). This methodology enables the generation of counterfactual narratives that are anchored in ethnographic evidence and hermeneutically interpreted, and which therefore constitute symbolically grounded and plausible futures. The criteria for plausibility correspond to contemporary guidelines for the assessment of trustworthy AI, namely human agency and oversight, transparency, and auditability.
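The abstract describes iABM only conceptually. As a loose illustration of the generative step it refers to, the following minimal Python sketch shows a toy agent-based model whose rule parameters can be varied to produce counterfactual trajectories for later interpretive analysis. All names (Agent, run_scenario, influence) are illustrative assumptions and not the authors' iABM implementation.

# Minimal, illustrative agent-based model sketch (not the authors' iABM):
# agents adopt a practice under a simple rule; varying the rule's parameter
# generates counterfactual trajectories for interpretive analysis.
import random

class Agent:
    def __init__(self, adopted=False):
        self.adopted = adopted

    def step(self, neighbours, influence):
        # Hypothetical rule: adopt with probability proportional to the
        # share of adopting neighbours, scaled by `influence`.
        if not self.adopted and neighbours:
            share = sum(n.adopted for n in neighbours) / len(neighbours)
            if random.random() < influence * share:
                self.adopted = True

def run_scenario(n_agents=50, influence=0.5, steps=20, seed=0):
    """Return the adoption count per step for one (counterfactual) scenario."""
    random.seed(seed)
    agents = [Agent(adopted=(i == 0)) for i in range(n_agents)]
    history = []
    for _ in range(steps):
        for a in agents:
            a.step(agents, influence)  # fully connected toy population
        history.append(sum(a.adopted for a in agents))
    return history

# Contrasting two parameterizations yields counterfactual trajectories that an
# analyst could then interpret against ethnographic evidence.
baseline = run_scenario(influence=0.5)
counterfactual = run_scenario(influence=0.2)
print(baseline[-1], counterfactual[-1])

In the article's terms, the symbol grounding would come not from the simulation itself but from anchoring such scenarios in ethnographic evidence and interpreting them hermeneutically; the sketch only shows the mechanical generation of alternative trajectories.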
Keywords: symbol grounding, interpretive ABM, interpretive social research, generative methods, transparent AI
Received: 08 Oct 2024; Accepted: 24 Feb 2025.
Copyright: © 2025 Neumann and Dirksen. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
* Correspondence:
Martin Neumann, University of Southern Denmark, Odense, Denmark
Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.