The neural basis of syntax is modality-independent: Evidence from functional magnetic resonance imaging with deaf signers

Abstract

The ability to combine words into sentences lies at the heart of the human capacity for language, which is not bound to speech but can also be externalised in the form of sign languages. In hearing neurotypical people, syntactic processing of spoken and written language is subserved by a primarily left-hemispheric “core language network” consisting of the posterior portion of the inferior frontal gyrus (pIFG) and the posterior middle temporal gyrus and superior temporal sulcus (pMTG/STS). It remains unclear, however, whether deaf signers who acquired and use a sign language as their primary and preferred means of communication recruit the same network when processing the grammatical structure of their native sign language. The present thesis set out to test this by first conducting a qualitative review of the published neuroimaging literature on the neural basis of sign language, followed by the first quantitative assessment of the available data from studies using functional magnetic resonance imaging (fMRI) and positron emission tomography, by means of an activation likelihood estimation meta-analysis. This analysis indicated that the left pIFG (i.e., Broca’s area) constitutes a modality-independent hub for processing spoken, written, and sign language. Next, the thesis documents the development of a normed psycholinguistic data set for German Sign Language (DGS), which was subsequently used to create stimulus materials for an fMRI experiment. Because sign languages are externalised in the visuo-spatial modality, it was further necessary to develop a means of controlling the motion of the signer producing the stimulus sentences in the videos, using computer vision technology and automated pose estimation.
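The motion-control step described above can be illustrated with a minimal sketch. The function below is a hypothetical example, not the thesis’s actual pipeline: it assumes that a pose estimator (e.g., OpenPose or MediaPipe) has already produced per-frame (x, y) keypoint coordinates for each stimulus video, and it reduces these to a single motion score (mean keypoint displacement per frame transition) that could be used to match motion across stimulus conditions.

```python
import math

def motion_energy(frames):
    """Mean per-keypoint displacement per frame transition.

    frames: list of frames; each frame is a list of (x, y) keypoint
    coordinates (e.g., wrists, elbows) as output by a pose estimator.
    Returns 0.0 for videos with fewer than two frames.
    """
    if len(frames) < 2:
        return 0.0
    total = 0.0
    count = 0
    for prev, curr in zip(frames, frames[1:]):
        for (x0, y0), (x1, y1) in zip(prev, curr):
            total += math.hypot(x1 - x0, y1 - y0)
            count += 1
    return total / count

# Toy example: two keypoints tracked over three frames
video = [
    [(0.0, 0.0), (1.0, 1.0)],
    [(0.0, 1.0), (1.0, 1.0)],
    [(0.0, 1.0), (2.0, 1.0)],
]
score = motion_energy(video)
```

A per-video score like this makes it possible to compare or equate overall signer motion between experimental and control videos before using them as fMRI stimuli.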
Lastly, an fMRI experiment with deaf signers and a control group of hearing non-signers is described, in combination with a meta-analysis of syntactic processing in spoken and written language in hearing people. Together, these show that the left-hemispheric core language network recruited for processing syntax in hearing non-signers, consisting of pIFG and pMTG/STS, is also recruited by deaf signers for processing syntax in sign language. This demonstrates the universality of the brain’s core network for processing syntax and suggests that the human brain is not intrinsically specialised for speech, but for processing abstract linguistic information, such as syntax, independent of language modality.

Type
Publication
In MPI Series in Human Cognitive and Brain Sciences
Date