Please use this identifier to cite or link to this item: https://hdl.handle.net/2445/217016
Title: | TopoGEN: topology-informed generative models |
Author: | Benarroch Jedlicki, Jack |
Director/Tutor: | Escalera Guerrero, Sergio; Casacuberta, Carles |
Keywords: | Homology; Mathematical statistics; Machine learning; Neural networks (Computer science); Bachelor's theses |
Issue Date: | 10-Jun-2024 |
Abstract: | The main goal of generative models is to learn an unknown distribution of data points in a high-dimensional space. However, generative models often face challenges such as mode collapse and time-consuming training. To address these problems, we propose a new way of training generative models that relies on topological regularizers extracting information from persistence diagrams. These topological loss terms capture meaningful features of the spaces formed by the true data and the generated data, such as the presence of clusters, loops or higher-dimensional holes. We provide original proofs showing that the functions used are stable with respect to the data and generically differentiable, allowing their optimization by gradient descent. Some of the results obtained in this thesis are new and extend the current knowledge of differentiability through barcode space to more general classes of functions; as a consequence, this work expands the current possibilities of differentiable topological regularization. The developed topological regularizers are first tested on synthetic datasets, demonstrating their ability to continuously deform point clouds so as to recover ground-truth topological features. The regularizers are then tested in variational autoencoders on the FashionMNIST dataset, where we observe improved performance compared to their non-regularized counterparts. Furthermore, these loss terms can be applied at any layer of a generative model, opening new ways of controlling the performance of inner layers and the spatial distribution of latent codes. We thus explore this line of application, and the striking effects observed suggest that topological regularization may be a useful ingredient for training generative models. |
Note: | Bachelor's degree final project in Mathematics, Facultat de Matemàtiques, Universitat de Barcelona, Year: 2024, Directors: Sergio Escalera Guerrero and Carles Casacuberta |
URI: | https://hdl.handle.net/2445/217016 |
Appears in Collections: | Treballs Finals de Grau (TFG) - Matemàtiques; Programari - Treballs de l'alumnat |
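The abstract above describes differentiable loss terms built from persistence diagrams and optimized by gradient descent. As a rough, hedged illustration of that idea only (not the thesis implementation and not code from the accompanying TopoGEN-main.zip), the sketch below shows a 0-dimensional topological regularizer in PyTorch: the 0-dimensional Vietoris-Rips death times of a point cloud equal the edge lengths of a minimum spanning tree of its pairwise-distance graph, so edge selection can be delegated to SciPy while the loss stays differentiable in the point coordinates. All names (`zero_dim_persistence`, `topological_loss`) and the sorted-death-value comparison are illustrative assumptions, not the loss defined in the thesis.

```python
# Hypothetical sketch of a 0-dimensional topological regularizer; not the thesis code.
# Assumes PyTorch and SciPy are installed and both point clouds have the same size.
import torch
from scipy.sparse.csgraph import minimum_spanning_tree


def zero_dim_persistence(points: torch.Tensor) -> torch.Tensor:
    """0-dimensional death times of a point cloud (minimum spanning tree edge lengths)."""
    dist = torch.cdist(points, points)  # (n, n) pairwise distances
    # Edge *selection* runs on a detached copy; gradients are recovered below
    # by indexing into the differentiable distance matrix.
    mst = minimum_spanning_tree(dist.detach().cpu().numpy())
    rows, cols = mst.nonzero()
    rows = torch.as_tensor(rows, dtype=torch.long)
    cols = torch.as_tensor(cols, dtype=torch.long)
    return dist[rows, cols]  # n - 1 death values, differentiable w.r.t. points


def topological_loss(real: torch.Tensor, generated: torch.Tensor) -> torch.Tensor:
    """Compare sorted 0-dimensional persistence values of two equally sized clouds."""
    d_real = torch.sort(zero_dim_persistence(real)).values
    d_gen = torch.sort(zero_dim_persistence(generated)).values
    return torch.mean((d_real - d_gen) ** 2)


if __name__ == "__main__":
    torch.manual_seed(0)
    real = torch.randn(128, 2)                     # target point cloud
    gen = torch.randn(128, 2, requires_grad=True)  # e.g. decoder outputs
    loss = topological_loss(real, gen)
    loss.backward()                                # gradients flow back to `gen`
    print(loss.item(), gen.grad.abs().mean().item())
```

In practice such a term would be added, with a weight, to the usual training objective (e.g. the VAE reconstruction and KL terms), and could be evaluated on the activations of any chosen layer, in the spirit of the layer-wise regularization discussed in the abstract.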
Files in This Item:
File | Description | Size | Format
---|---|---|---
tfg_benarroch_jdlicki_jack.pdf | Thesis report | 22.88 MB | Adobe PDF
TopoGEN-main.zip | Source code | 40.5 MB | zip
This item is licensed under a Creative Commons License.