Plank, I. S.; Koehler, J. C.; Nelson, A. M.; Koutsouleris, N.; Falter-Wagner, C. M. (2023): Automated extraction of speech and turn-taking parameters in autism allows for diagnostic classification using a multivariable prediction model. Frontiers in Psychiatry, 14: 1257569. ISSN 1664-0640
The publication is available under the Creative Commons Attribution (CC BY) license.
Abstract
Autism spectrum disorder (ASD) is diagnosed on the basis of speech and communication differences, amongst other symptoms. Since conversations are essential for building connections with others, it is important to understand the exact nature of differences between autistic and non-autistic verbal behaviour and to evaluate the potential of these differences for diagnostics. In this study, we recorded dyadic conversations and used automated extraction of speech and interactional turn-taking features from 54 non-autistic and 26 autistic participants. The extracted speech and turn-taking parameters showed high potential as a diagnostic marker. A linear support vector machine was able to predict the dyad type with 76.2% balanced accuracy (sensitivity: 73.8%, specificity: 78.6%), suggesting that digitally assisted diagnostics could significantly enhance the current clinical diagnostic process due to their objectivity and scalability. In group comparisons on the individual and dyadic level, we found that autistic interaction partners spoke more slowly and in a more monotonous manner than non-autistic interaction partners, and that mixed dyads consisting of an autistic and a non-autistic participant had increased periods of silence.
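The reported classification metrics are internally consistent: balanced accuracy is the arithmetic mean of sensitivity and specificity, which matters here because the groups are imbalanced (54 vs. 26 participants). A minimal sketch, using the values from the abstract (the function name is illustrative, not from the paper):

```python
def balanced_accuracy(sensitivity: float, specificity: float) -> float:
    """Balanced accuracy: mean of sensitivity (true positive rate)
    and specificity (true negative rate)."""
    return (sensitivity + specificity) / 2

# Values reported in the abstract: 73.8% sensitivity, 78.6% specificity.
print(round(balanced_accuracy(0.738, 0.786), 3))  # → 0.762
```

Unlike plain accuracy, this score is not inflated by a classifier that favours the larger (non-autistic) group.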
| Document type: | Article (LMU University Hospital) |
|---|---|
| Organisational unit (faculties): | 07 Medicine > LMU Munich University Hospital > Department of Psychiatry and Psychotherapy |
| DFG subject classification: | Life Sciences |
| Date deposited: | 05 Dec 2023 08:49 |
| Last modified: | 07 Dec 2023 12:20 |
| URI: | https://oa-fund.ub.uni-muenchen.de/id/eprint/1048 |
| DFG: | Funded by the Deutsche Forschungsgemeinschaft (DFG) - 491502892 |