About this Research Topic
Advances in big data and artificial intelligence are opening exciting new possibilities for screening for, and supporting the diagnosis of, mental health conditions. In primary care, explainable AI offers solutions that can streamline mental health evaluations by providing accessible and interpretable tools that align with physicians' workflows. Such approaches have the potential to overcome at least some barriers to identifying persons with mental health disorders, as they can be cost-effective, efficient, and unobtrusive, and can also be used for regular monitoring of persons at risk.
Particular promise lies in explainable, evidence-based approaches that translate existing knowledge into technology-supported screening and diagnosis of mental health conditions. Explainable AI tools can also empower primary care providers by demystifying complex algorithms and enabling them to make informed decisions based on AI-generated insights. Additionally, these tools can facilitate interdisciplinary collaboration between primary care physicians and mental health specialists, bridging gaps in holistic care.
In this Research Topic, we aim to create a collection of papers on recent advances in explainable AI approaches to screening for and/or supporting the diagnosis of mental health disorders. This Topic welcomes studies ranging from basic research to the implementation of such solutions in healthcare, with areas including but not limited to:
• Evaluation of markers that can be exploited during screening,
• Development and validation of algorithms,
• Applications of AI screening and diagnosis in practice,
• Integration of AI into primary care workflows to enhance early detection and management of mental health conditions,
• Validation of AI-based approaches or their comparison with standard psychometric tools,
• AI tools for addressing mental health comorbidities in patients with chronic physical conditions,
• The role of AI in improving care equity in underserved or resource-constrained settings.
We welcome various article types: original research, reviews, policy and practice reviews, policy briefs, hypothesis and theory papers, methods papers, perspective papers, clinical trials, case studies, technology and code reports, and brief research reports.
Keywords: explainable AI, artificial intelligence, mental health, screening, diagnosing, monitoring, depression
Important Note: All contributions to this Research Topic must be within the scope of the section and journal to which they are submitted, as defined in their mission statements. Frontiers reserves the right to guide an out-of-scope manuscript to a more suitable section or journal at any stage of peer review.