Please use this identifier to cite or link to this item: https://hdl.handle.net/2445/220739
Full metadata record
DC Field: Value
dc.contributor.author: Courbin, Frédéric
dc.contributor.author: Bisigello, Laura
dc.contributor.author: Castander, Francisco Javier
dc.contributor.author: Carretero, Jorge
dc.contributor.author: Humphrey, Ayla
dc.contributor.author: Euclid Collaboration
dc.date.accessioned: 2025-04-30T15:33:38Z
dc.date.available: 2025-04-30T15:33:38Z
dc.date.issued: 2023
dc.identifier.issn: 0004-6361
dc.identifier.uri: https://hdl.handle.net/2445/220739
dc.description.abstract: The Euclid Space Telescope will provide deep imaging at optical and near-infrared wavelengths, along with slitless near-infrared spectroscopy, across ∼15 000 deg² of the sky. Euclid is expected to detect ∼12 billion astronomical sources, facilitating new insights into cosmology, galaxy evolution, and various other topics. In order to optimally exploit the expected very large dataset, appropriate methods and software tools need to be developed. Here we present a novel machine-learning-based methodology for the selection of quiescent galaxies using broadband Euclid IE, YE, JE, and HE photometry, in combination with multi-wavelength photometry from other large surveys (e.g. the Rubin LSST). The ARIADNE pipeline uses meta-learning to fuse decision-tree ensembles, nearest-neighbours, and deep-learning methods into a single classifier that yields significantly higher accuracy than any of the individual learning methods separately. The pipeline has been designed to have ‘sparsity awareness’, such that missing photometry values are informative for the classification. In addition, our pipeline is able to derive photometric redshifts for galaxies selected as quiescent, aided by the ‘pseudolabelling’ semi-supervised method, and using an outlier detection algorithm to identify and reject likely catastrophic outliers. After the application of the outlier filter, our pipeline achieves a normalised mean absolute deviation of ≲ 0.03 and a fraction of catastrophic outliers of ≲ 0.02 when measured against the COSMOS2015 photometric redshifts. We apply our classification pipeline to mock galaxy photometry catalogues corresponding to three main scenarios: (i) Euclid Deep Survey photometry with ancillary ugriz, WISE, and radio data; (ii) Euclid Wide Survey photometry with ancillary ugriz, WISE, and radio data; and (iii) Euclid Wide Survey photometry only, with no foreknowledge of galaxy redshifts. In a like-for-like comparison, our classification pipeline outperforms UVJ selection, in addition to the Euclid IE − YE, JE − HE and u − IE, IE − JE colour–colour methods, with improvements in completeness and the F1-score (the harmonic mean of precision and recall) of up to a factor of 2.
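The abstract quotes two photometric-redshift quality metrics (a normalised mean absolute deviation ≲ 0.03 and a catastrophic-outlier fraction ≲ 0.02) and the F1-score. The following is a minimal sketch of how such metrics are conventionally computed in photo-z work, assuming the standard definitions (NMAD scaled by 1.4826, and a 0.15 outlier threshold as a common convention); the paper's exact conventions may differ.

```python
import numpy as np

def nmad(z_phot, z_ref):
    """Normalised median absolute deviation of dz = (z_phot - z_ref)/(1 + z_ref).

    Conventional photo-z definition: 1.4826 * median(|dz - median(dz)|).
    """
    z_phot, z_ref = np.asarray(z_phot, float), np.asarray(z_ref, float)
    dz = (z_phot - z_ref) / (1.0 + z_ref)
    return 1.4826 * np.median(np.abs(dz - np.median(dz)))

def outlier_fraction(z_phot, z_ref, threshold=0.15):
    """Fraction of catastrophic outliers: |dz| > threshold.

    A threshold of 0.15 is a common choice in the literature; the paper's
    criterion is an assumption here, not taken from the source.
    """
    z_phot, z_ref = np.asarray(z_phot, float), np.asarray(z_ref, float)
    dz = np.abs(z_phot - z_ref) / (1.0 + z_ref)
    return float(np.mean(dz > threshold))

def f1_score(precision, recall):
    """F1-score: the harmonic mean of precision and recall (as in the abstract)."""
    return 2.0 * precision * recall / (precision + recall)
```

For example, a classifier with precision 1.0 and recall 0.5 has an F1-score of 2/3, which is why a factor-of-2 gain in completeness (recall) at fixed precision can translate into a large F1 improvement.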
dc.format.extent: 36 p.
dc.format.mimetype: application/pdf
dc.language.iso: eng
dc.publisher: EDP Sciences
dc.relation.isformatof: Reproduction of the document published at: https://doi.org/10.1051/0004-6361/202244307
dc.relation.ispartof: Astronomy & Astrophysics, 2023, vol. 671, A99
dc.relation.uri: https://doi.org/10.1051/0004-6361/202244307
dc.rights: (c) The European Southern Observatory (ESO), 2023
dc.source: Articles publicats en revistes (Institut de Ciències del Cosmos (ICCUB))
dc.subject.classification: Aprenentatge automàtic (machine learning)
dc.subject.classification: Telescopis espacials (space telescopes)
dc.subject.classification: Galàxies (galaxies)
dc.subject.classification: Cosmologia (cosmology)
dc.subject.other: Machine learning
dc.subject.other: Space telescopes
dc.subject.other: Galaxies
dc.subject.other: Cosmology
dc.title: Euclid preparation: XXII. Selection of quiescent galaxies from mock photometry using machine learning
dc.type: info:eu-repo/semantics/article
dc.type: info:eu-repo/semantics/publishedVersion
dc.identifier.idgrec: 755839
dc.date.updated: 2025-04-30T15:33:38Z
dc.rights.accessRights: info:eu-repo/semantics/openAccess
Appears in Collections:Articles publicats en revistes (Institut de Ciències del Cosmos (ICCUB))

Files in This Item:
File: 883632.pdf (5.17 MB, Adobe PDF)


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.