Tests of syntactic comprehension in aphasia: an investigation of task effects
Full text not archived in this repository.
To link to this article DOI: 10.1080/02687030802380165
Background: Consistency of performance across tasks that assess syntactic comprehension in aphasia has clinical and theoretical relevance. In this paper we add to the relatively sparse previous work on how sentence comprehension abilities are influenced by the nature of the assessment task.

Aims: Our aims were: (1) to compare linguistic performance across sentence-picture matching, enactment, and truth-value judgement tasks; (2) to investigate the impact of pictorial stimuli on syntactic comprehension.

Methods & Procedures: We tested a group of 10 aphasic speakers (3 with fluent and 7 with non-fluent aphasia) on three tasks (Experiment 1): (i) sentence-picture matching with four pictures, (ii) sentence-picture matching with two pictures, and (iii) enactment. A further task, truth-value judgement, was given to a subgroup of these speakers (n=5, Experiment 2). Similar sentence types were used across all tasks, including canonical (actives, subject clefts) and non-canonical (passives, object clefts) sentences. We undertook two types of analysis: (a) we compared canonical and non-canonical sentences in each task; (b) we compared performance between (i) actives and passives and (ii) subject and object clefts in each task. We examined the results of all participants both as a group and as a case series.

Outcomes & Results: Several task effects emerged. Overall, the two-picture sentence-picture matching and enactment tasks were more discriminating than the four-picture condition. Group performance in the truth-value judgement task was similar to that in two-picture sentence-picture matching and enactment. At the individual level, performance across tasks contrasted with some group results.

Conclusions: Our findings revealed task effects across participants. We discuss reasons that could explain the diverse profiles of performance, and the implications for clinical practice.