Please use this identifier to cite or link to this item: https://hdl.handle.net/2445/221885
Full metadata record
DC Field: Value
dc.contributor.author: Morucci, Piermatteo
dc.contributor.author: Giannelli, Francesco
dc.contributor.author: Richter, Craig Geoffrey
dc.contributor.author: Molinaro, Nicola
dc.date.accessioned: 2025-06-30T07:30:11Z
dc.date.available: 2025-06-30T07:30:11Z
dc.date.issued: 2025-04-14
dc.identifier.issn: 1662-453X
dc.identifier.uri: https://hdl.handle.net/2445/221885
dc.description.abstract: Hearing spoken words can enhance the recognition of visual object categories. Yet, the mechanisms that underpin this facilitation are incompletely understood. Recent proposals suggest that words can alter visual processes by activating category-specific representations in sensory regions. Here, we tested the hypothesis that neural oscillations serve as a mechanism to activate language-generated visual representations. Participants performed a cue-picture matching task where cues were either spoken words, in their native or second language, or natural sounds, while their EEG and reaction times were recorded. Behaviorally, we found that images cued by words were recognized faster than those cued by natural sounds. This indicates that language activates more accurate semantic representations than natural sounds do. A time-frequency analysis of cue-target intervals revealed that this label-advantage effect was associated with enhanced power in posterior alpha (9-11 Hz) and beta oscillations (17-19 Hz), both of which were larger when the image was preceded by a word than by a natural sound. These results suggest that alpha and beta rhythms may play distinct functional roles in supporting language-mediated visual object recognition: alpha might function to amplify sensory representations in posterior regions, while beta may (re)activate the network states elicited by the auditory cue.
dc.format.extent: 11 p.
dc.format.mimetype: application/pdf
dc.language.iso: eng
dc.publisher: Frontiers Media
dc.relation.isformatof: Reproduction of the document published at: https://doi.org/10.3389/fnins.2025.1467249
dc.relation.ispartof: Frontiers in Neuroscience, 2025, vol. 19
dc.relation.uri: https://doi.org/10.3389/fnins.2025.1467249
dc.rights: cc-by (c) Morucci et al., 2025
dc.rights.uri: http://creativecommons.org/licenses/by/3.0/es/
dc.source: Articles publicats en revistes (Cognició, Desenvolupament i Psicologia de l'Educació)
dc.subject.classification: Electrofisiologia
dc.subject.classification: Percepció visual
dc.subject.classification: Neurolingüística
dc.subject.other: Electrophysiology
dc.subject.other: Visual perception
dc.subject.other: Neurolinguistics
dc.title: Spoken words affect visual object recognition via the modulation of alpha and beta oscillations
dc.type: info:eu-repo/semantics/article
dc.type: info:eu-repo/semantics/publishedVersion
dc.date.updated: 2025-06-26T11:46:22Z
dc.rights.accessRights: info:eu-repo/semantics/openAccess
dc.identifier.pmid: 40297533
Appears in Collections:Articles publicats en revistes (Cognició, Desenvolupament i Psicologia de l'Educació)
Articles publicats en revistes (Institut d'Investigació Biomèdica de Bellvitge (IDIBELL))

Files in This Item:
File: fnins-1-1467249 (1).pdf | Size: 5.88 MB | Format: Adobe PDF


This item is licensed under a Creative Commons License.