TopoGEN: topology-informed generative models

dc.contributor.advisor: Escalera Guerrero, Sergio
dc.contributor.advisor: Casacuberta, Carles
dc.contributor.author: Benarroch Jedlicki, Jack
dc.date.accessioned: 2024-12-11T09:39:51Z
dc.date.available: 2024-12-11T09:39:51Z
dc.date.issued: 2024-06-10
dc.description: Bachelor's Final Theses in Mathematics, Faculty of Mathematics, Universitat de Barcelona. Year: 2024. Advisors: Sergio Escalera Guerrero and Carles Casacuberta
dc.description.abstract: The main goal of generative models is to learn an unknown distribution of data points in a high-dimensional space. However, generative models often face challenges such as mode collapse and time-consuming training. To address these problems, we propose a new way of training generative models based on topological regularizers that extract information from persistence diagrams. These topological loss terms capture meaningful features of the spaces formed by the true and the generated data, such as the presence of clusters, loops, or higher-dimensional holes. We provide original proofs showing that the functions used are stable with respect to the data and generically differentiable, which allows their optimization by gradient descent. Some of the results obtained in this thesis extend the current knowledge of differentiability through barcode space to more general classes of functions, and as a consequence this work expands the current possibilities of differentiable topological regularization. The topological regularizers are first tested on synthetic datasets, demonstrating their ability to continuously deform point clouds toward ground-truth topological features. They are then tested on variational autoencoders trained on the FashionMNIST dataset, where they yield improved performance compared to their non-regularized counterparts. Furthermore, these loss terms can be applied at any layer of a generative model, opening new ways of controlling the behavior of inner layers and the spatial distribution of data codes in the latent space. We explore this line of application, and the striking effects observed suggest that topological regularization may be a useful ingredient for training generative models.
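The core idea of a persistence-based regularizer described in the abstract can be illustrated with a minimal, self-contained sketch (not taken from the thesis). For 0-dimensional homology, the death times in the Vietoris–Rips persistence diagram of a point cloud coincide with the edge lengths of its Euclidean minimum spanning tree, so a toy loss can compare the sorted death vectors of two equally sized point clouds. The function names and the squared-distance comparison are illustrative assumptions; real implementations use a persistence library (e.g. Gudhi) and differentiate through the diagram with automatic differentiation.

```python
import numpy as np

def h0_death_times(points):
    """Sorted H0 death times of a point cloud under the Vietoris-Rips
    filtration. These equal the edge lengths of a Euclidean minimum
    spanning tree, computed here with Prim's algorithm."""
    n = len(points)
    # Pairwise Euclidean distance matrix.
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    in_tree = np.zeros(n, dtype=bool)
    in_tree[0] = True
    best = d[0].copy()  # cheapest known connection of each vertex to the tree
    deaths = []
    for _ in range(n - 1):
        candidates = np.where(in_tree, np.inf, best)
        j = int(np.argmin(candidates))   # closest vertex outside the tree
        deaths.append(candidates[j])     # its MST edge length = an H0 death
        in_tree[j] = True
        best = np.minimum(best, d[j])
    return np.sort(np.array(deaths))

def topo_h0_loss(x, y):
    """Toy topological loss: squared distance between the sorted H0
    death vectors of two point clouds of equal size (hypothetical
    simplification of a persistence-diagram-based loss term)."""
    return float(np.sum((h0_death_times(x) - h0_death_times(y)) ** 2))
```

The loss vanishes when the two clouds have identical connectivity scales (it is invariant under rigid motions of either cloud) and grows as their cluster structures diverge, which is the qualitative behavior a 0-dimensional topological regularizer exploits.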
dc.format.extent: 95 p.
dc.format.mimetype: application/pdf
dc.identifier.uri: https://hdl.handle.net/2445/217016
dc.language.iso: eng
dc.rights: cc-by-nc-nd (c) Jack Benarroch Jedlicki, 2024
dc.rights: code: GPL (c) Jack Benarroch Jedlicki, 2024
dc.rights.accessRights: info:eu-repo/semantics/openAccess
dc.rights.uri: http://creativecommons.org/licenses/by-nc-nd/3.0/es/
dc.rights.uri: http://www.gnu.org/licenses/gpl-3.0.ca.html
dc.source: Treballs Finals de Grau (TFG) - Matemàtiques
dc.subject.classification: Homologia
dc.subject.classification: Estadística matemàtica
dc.subject.classification: Aprenentatge automàtic
dc.subject.classification: Xarxes neuronals (Informàtica)
dc.subject.classification: Treballs de fi de grau
dc.subject.other: Homology
dc.subject.other: Mathematical statistics
dc.subject.other: Machine learning
dc.subject.other: Neural networks (Computer science)
dc.subject.other: Bachelor's theses
dc.title: TopoGEN: topology-informed generative models
dc.type: info:eu-repo/semantics/bachelorThesis

Files

Original package

Name:
tfg_benarroch_jdlicki_jack.pdf
Size:
22.35 MB
Format:
Adobe Portable Document Format
Description:
Thesis report
Name:
TopoGEN-main.zip
Size:
39.55 MB
Format:
ZIP file
Description:
Source code