Abstract
English language teachers have usually separated the
teaching of reading from the teaching of speaking. By
decoupling the two skills, we may have overlooked their
potential synergy in affecting intonation. Visual reading
involves scanning written words, whereas reading aloud
additionally involves turning those words into sounds. This presentation
aims to encourage the audience to reconsider this close
relationship and to think about how language tasks utilize eye-voice coordination, if it exists.
This study compared moment-by-moment eye-movement data collected with an EyeLink Portable Duo eye-tracker against the same participants’ intonation changes. Ninety-six participants of different proficiency levels took part: 50 university students (high English proficiency) and 46 secondary school students (low English proficiency). All participants read aloud stimulus materials from a screen in a soundproof phonology lab. Using five sentence patterns (e.g., emotionally charged sentences and contrastive sentences), the researchers were able to assess how syntax and meaning affect eye-and-voice (eye-or-voice) coordination.
(a) A series of correlation analyses and t-tests revealed that syntactic patterns affected eye-voice coordination in different ways. (b) Different eye-movement measures (e.g., fixation counts and durations) yielded different results. Overall, the eye and the voice were evidently coordinated in the reading aloud of certain sentence types but not others. The study will conclude with a note on how the analysis can be further deepened.
The main problem in today’s English language education is that teachers do not have tangible clues to indicate why poor speakers lack good intonation and why good speakers produce melodic speech. Eye movements and intonation have each been studied extensively, but they have rarely been studied together. This study explores a simple yet intuitively sound question: how the eye and the voice may work together to produce effective intonation, and what that means for teaching.
Original language | English
---|---
Publication status | Published - 4 Jul 2023
Event | 2023 International Conference on Open and Innovative Education (ICOIE 2023), Hong Kong, 4 Jul 2023 → 6 Jul 2023

Conference

Conference | 2023 International Conference on Open and Innovative Education (ICOIE 2023)
---|---
Country/Territory | Hong Kong
Period | 4/07/23 → 6/07/23
Internet address | https://www.hkmu.edu.hk/URC/icoie2023_programme_book.pdf (Programme & Abstracts of Papers)