Anticipating multisensory environments: Evidence for a supra-modal predictive system

dc.contributor.author: Sabio-Albert, Marc
dc.contributor.author: Fuentemilla Garriga, Lluís
dc.contributor.author: Pérez-Bellido, Alexis
dc.date.accessioned: 2025-06-27T12:55:17Z
dc.date.available: 2025-06-27T12:55:17Z
dc.date.issued: 2025-01-01
dc.date.updated: 2025-06-27T12:55:17Z
dc.description.abstract: Our perceptual experience is generally framed in multisensory environments abundant in predictive information. Previous research on statistical learning has shown that humans can learn regularities in different sensory modalities in parallel, but it has not yet determined whether multisensory predictions are generated through a modality-specific predictive mechanism or instead rely on a supra-modal predictive system. Here, across two experiments, we tested these hypotheses by presenting participants with concurrent pairs of predictable auditory and visual low-level stimuli (i.e., tones and gratings). In different experimental blocks, participants had to attend to the stimuli in one modality while ignoring stimuli from the other sensory modality (distractors), and perform a perceptual discrimination task on the second stimulus of the attended modality (targets). Orthogonal to the task goal, both the attended and unattended pairs followed transitional probabilities, so targets and distractors could be expected or unexpected. We found that participants performed better for expected than for unexpected targets. This effect generalized to the distractors, but only when the relevant targets were expected. Such interactive effects suggest that predictions may be gated by a supra-modal system with shared resources across sensory modalities that are distributed according to their respective behavioural relevance.
dc.format.extent: 10 p.
dc.format.mimetype: application/pdf
dc.identifier.issn: 0010-0277
dc.identifier.uri: https://hdl.handle.net/2445/221836
dc.language.iso: eng
dc.publisher: Elsevier B.V.
dc.relation.isformatof: Reproduction of the document published at: https://doi.org/10.1016/j.cognition.2024.105970
dc.relation.ispartof: Cognition, 2025, vol. 254, 105970
dc.relation.uri: https://doi.org/10.1016/j.cognition.2024.105970
dc.rights: cc by (c) Sabio-Albert, Marc et al., 2025
dc.rights.accessRights: info:eu-repo/semantics/openAccess
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/
dc.source: Articles publicats en revistes (Cognició, Desenvolupament i Psicologia de l'Educació)
dc.subject.classification: Aprenentatge automàtic
dc.subject.classification: Aprenentatge implícit
dc.subject.classification: Percepció
dc.subject.classification: Predicció (Psicologia)
dc.subject.classification: Cognició
dc.subject.other: Machine learning
dc.subject.other: Implicit learning
dc.subject.other: Perception
dc.subject.other: Prediction (Psychology)
dc.subject.other: Cognition
dc.title: Anticipating multisensory environments: Evidence for a supra-modal predictive system
dc.type: info:eu-repo/semantics/article
dc.type: info:eu-repo/semantics/publishedVersion
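The abstract above describes concurrent auditory and visual pairs governed by transitional probabilities, with one modality attended per block. The following is a minimal sketch of how such a trial structure could be generated; the 0.75/0.25 transition split, the stimulus labels, and the function names are illustrative assumptions, not the study's actual parameters or code.

import random

# Illustrative sketch of the paradigm described in the abstract: each trial
# contains a concurrent auditory pair (tones) and visual pair (gratings), and
# the second stimulus of each pair follows the first according to a learned
# transitional probability. Parameters below are assumptions for illustration.

P_EXPECTED = 0.75  # assumed probability that a pair completes as expected

def make_pair(first, expected_second, unexpected_second):
    """Return (first, second, is_expected) for one stimulus pair."""
    if random.random() < P_EXPECTED:
        return first, expected_second, True
    return first, unexpected_second, False

def make_trial(attended="auditory"):
    """Build one trial with an attended (target) pair and an ignored (distractor) pair."""
    a1, a2, a_expected = make_pair("low_tone", "high_tone", "mid_tone")
    v1, v2, v_expected = make_pair("left_tilt", "right_tilt", "vertical_grating")
    return {
        "attended_modality": attended,
        "auditory_pair": (a1, a2),
        "visual_pair": (v1, v2),
        "target_expected": a_expected if attended == "auditory" else v_expected,
        "distractor_expected": v_expected if attended == "auditory" else a_expected,
    }

# Example: a short block in which audition is task-relevant and vision is ignored
block = [make_trial("auditory") for _ in range(8)]
for trial in block:
    print(trial["target_expected"], trial["distractor_expected"])

Crossing target_expected with distractor_expected in this way yields the four trial types behind the interaction reported in the abstract, where distractor expectation mattered only when the target itself was expected.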

Files

Original bundle

Name: 894418.pdf
Size: 3.44 MB
Format: Adobe Portable Document Format
