Title: EEG-Based Classification of Rhyming and Non-Rhyming Trials in a Multisensory Language Task Using ICA and LDA
Authors: Zografakis, Georgios
Δελής Ιωάννης
Keywords: Electroencephalography (EEG)
Linear Discriminant Analysis (LDA)
Independent Component Analysis
Event-related potentials (ERP)
Multisensory processing
Issue Date: 29-Oct-2025
Abstract: Understanding the neural signatures that reflect the temporal and spatial dynamics of language comprehension is crucial. In this thesis, we investigated the electrophysiological signatures of rhyme detection using EEG recordings from 21 participants who performed a rhyming judgment task. Participants listened to acoustically presented sentences and were subsequently shown a visually presented word, indicating whether it rhymed with a word in the preceding sentence. The EEG data were epoched from −1.5 s to +2.0 s relative to the onset of the visual word and pre-processed with filtering and independent component analysis (ICA) for artifact rejection. A linear discriminant analysis (LDA) classifier was then trained to distinguish rhyming from non-rhyming trials in sliding windows of 70 ms advanced in 20 ms steps across the epoch. For each window, the EEG signals were vectorized and decoding performance was quantified with the area under the ROC curve (AUC). This approach produced decoding curves revealing the intervals in which the EEG signal differentiated between the two conditions. To examine the spatial contribution of individual electrodes, we extracted the classifier weight vectors and projected them onto the scalp, generating topographic maps at the group peak latencies. At the group level, decoding curves were averaged across participants, and the LDA weight vectors from the peak windows were combined to derive a representative scalp distribution. The group-level average revealed a modest but reliable increase in decoding performance, peaking at 475 ms (Az = 0.55). The associated scalp map showed fronto-central positivity combined with posterior negativity, indicating a distributed anterior–posterior pattern of neural activity supporting rhyme discrimination. These results suggest that EEG carries temporally specific and spatially distributed information that distinguishes rhyming from non-rhyming trials.
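
The sliding-window decoding described in the abstract can be sketched in a few lines. The snippet below is a minimal illustration rather than the thesis pipeline itself: it assumes preprocessed, ICA-cleaned epochs are already available as a NumPy array of shape (trials, channels, samples), and the variable names (epochs, labels, sfreq), the choice of scikit-learn, and the cross-validation settings are assumptions made for the example.

```python
# Minimal sketch of sliding-window LDA decoding of rhyming vs. non-rhyming trials.
# Assumes `epochs` is a preprocessed array of shape (n_trials, n_channels, n_samples),
# `labels` is 1 for rhyming and 0 for non-rhyming trials, and `sfreq` is the sampling rate (Hz).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import StratifiedKFold, cross_val_score

def sliding_window_decoding(epochs, labels, sfreq, win_ms=70, step_ms=20, n_folds=5):
    win = int(round(win_ms * sfreq / 1000))    # 70 ms window length in samples
    step = int(round(step_ms * sfreq / 1000))  # 20 ms step size in samples
    n_trials, n_channels, n_samples = epochs.shape
    cv = StratifiedKFold(n_splits=n_folds, shuffle=True, random_state=0)

    onsets, aucs = [], []
    for start in range(0, n_samples - win + 1, step):
        # Vectorize each trial's channel-by-time data within the current window.
        X = epochs[:, :, start:start + win].reshape(n_trials, -1)
        scores = cross_val_score(LinearDiscriminantAnalysis(), X, labels,
                                 cv=cv, scoring="roc_auc")
        onsets.append(start)
        aucs.append(scores.mean())             # cross-validated AUC for this window
    return np.array(onsets), np.array(aucs)
```

For the topographic maps, one could refit the LDA on all trials at the peak window, reshape clf.coef_ from (1, n_channels * win) back to (n_channels, win), and average over the time dimension to obtain one weight per channel for scalp plotting (for example with MNE-Python's mne.viz.plot_topomap); this mirrors, but does not reproduce, the weight-projection step described in the abstract.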
URI: http://artemis.cslab.ece.ntua.gr:8080/jspui/handle/123456789/19931
Appears in Collections: Μεταπτυχιακές Εργασίες - M.Sc. Theses

Files in This Item:
File: Thesis Zogra.pdf
Size: 3.53 MB
Format: Adobe PDF

