
Access to words and signs in bilinguals in a spoken and a signed language

What is it about?

Spoken languages are expressed through the audio-oral modality, while signed languages use the visuo-spatial modality. This study examines how language modality influences lexical access in the brain. It also explores cross-modal access: whether seeing a sign activates words of the spoken language, and, vice versa, whether hearing a word activates signs of the signed language. To this end, we ran a series of eye-tracking experiments.

Why is it important?

Our findings show how access to spoken and signed languages unfolds over time and how the two languages relate in bilinguals. For signed language processing, the study teases apart the contributions of two components of signs, location and handshape, and the time course of each (location is accessed before handshape).

The following have contributed to this page:
Saul Villameriel
' ,"url"));