Please use this identifier to cite or link to this item: http://artemis.cslab.ece.ntua.gr:8080/jspui/handle/123456789/19931
Full metadata record
DC Field: Value [Language]
dc.contributor.author: Zografakis, Georgios
dc.date.accessioned: 2025-11-14T08:26:51Z
dc.date.available: 2025-11-14T08:26:51Z
dc.date.issued: 2025-10-29
dc.identifier.uri: http://artemis.cslab.ece.ntua.gr:8080/jspui/handle/123456789/19931
dc.description.abstract: Understanding the neural signatures that arise from the temporal and spatial dynamics of language comprehension is crucial. In this thesis, we investigated the electrophysiological signatures of rhyme detection using EEG recordings from 21 participants who performed a rhyming judgment task. During the experiment, participants listened to acoustically presented sentences and were subsequently shown a visually presented word, for which they indicated whether it rhymed with a word in the preceding sentence. To analyse the EEG data, specific preprocessing steps were applied. First, the EEG data were epoched from −1.5 s to +2.0 s relative to the onset of the visual word and then pre-processed with filtering and independent component analysis (ICA) for artifact rejection. A linear discriminant analysis (LDA) classifier was then trained to distinguish rhyming from non-rhyming trials in sliding windows of 70 ms advanced in 20 ms steps across the epoch. For each window, the EEG signals were vectorized and decoding performance was quantified using the area under the curve (AUC). This approach produced decoding curves that revealed the intervals in which the EEG signal differentiated between the two conditions. To examine the spatial contribution of the electrodes, we extracted the classifier weight vectors and projected them onto the scalp, generating topographic maps at the group peak latencies. At the group level, decoding curves were averaged across all participants, and the LDA weight vectors from the peak windows were combined to derive a representative scalp distribution. The group-level average revealed a modest but reliable increase in decoding performance, peaking at 475 ms (Az = 0.55). The associated scalp map showed a frontal-central positivity combined with a posterior negativity, indicating a distributed anterior-posterior pattern of neural activity supporting rhyme discrimination. These results suggest that EEG carries temporally specific and spatially distributed information distinguishing rhyming from non-rhyming trials. [en_US]
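The sliding-window decoding procedure described in the abstract can be illustrated with a minimal sketch. The code below is not the author's pipeline: it assumes preprocessed, ICA-cleaned epochs are already available as a NumPy array X of shape (n_trials, n_channels, n_times) with binary labels y (1 = rhyme, 0 = non-rhyme), and it assumes a sampling rate of 250 Hz, 5-fold cross-validation, and a shrinkage-regularized LDA solver for illustration. The 70 ms window, 20 ms step, and the −1.5 s epoch start follow the abstract.

```python
"""Sketch of sliding-window LDA decoding of rhyming vs. non-rhyming trials."""
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import StratifiedKFold, cross_val_predict


def sliding_window_auc(X, y, sfreq=250.0, tmin=-1.5, win_ms=70, step_ms=20):
    """Return window-centre times (s) and cross-validated AUC per window.

    Assumed inputs: X of shape (n_trials, n_channels, n_times), binary y.
    """
    n_trials, n_channels, n_times = X.shape
    win = int(round(win_ms / 1000 * sfreq))    # samples per 70 ms window
    step = int(round(step_ms / 1000 * sfreq))  # samples per 20 ms step
    cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)

    times, aucs = [], []
    for start in range(0, n_times - win + 1, step):
        # Vectorize channels x time samples within the window.
        Xw = X[:, :, start:start + win].reshape(n_trials, -1)
        clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
        # Cross-validated decision values -> AUC (the "Az" decoding metric).
        scores = cross_val_predict(clf, Xw, y, cv=cv,
                                   method="decision_function")
        aucs.append(roc_auc_score(y, scores))
        times.append(tmin + (start + win / 2) / sfreq)
    return np.array(times), np.array(aucs)
```

Averaging the resulting AUC curves across participants and projecting the LDA weight vectors from the peak window onto the scalp would then correspond to the group-level decoding curve and topography described in the abstract.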
dc.language: en [en_US]
dc.subject: Electroencephalography (EEG) [en_US]
dc.subject: Linear discriminant analysis [en_US]
dc.subject: Independent Component Analysis [en_US]
dc.subject: Event-related potentials (ERP) [en_US]
dc.subject: Multisensory processing [en_US]
dc.title: EEG-Based Classification of Rhyming and Non-Rhyming Trials in a Multisensory Language Task Using ICA and LDA [en_US]
dc.description.pages: 51 [en_US]
dc.contributor.supervisor: Δελής Ιωάννης [en_US]
dc.department: Other (Άλλο) [en_US]
Appears in Collections: Μεταπτυχιακές Εργασίες - M.Sc. Theses

Files in This Item:
Thesis Zogra.pdf (3.53 MB, Adobe PDF)


Items in Artemis are protected by copyright, with all rights reserved, unless otherwise indicated.