Please use this identifier to cite or link to this item: http://hdl.handle.net/2445/180464
Full metadata record
dc.contributor.advisor: Pujol Vila, Oriol
dc.contributor.author: Sánchez Font, Núria
dc.date.accessioned: 2021-10-08T09:47:22Z
dc.date.available: 2021-10-08T09:47:22Z
dc.date.issued: 2020-06-29
dc.identifier.uri: http://hdl.handle.net/2445/180464
dc.description: Final projects of the Master in Fundamentals of Data Science, Faculty of Mathematics, Universitat de Barcelona. Year: 2020. Advisor: Oriol Pujol Vila
dc.description.abstract: [en] Reservoir computing (RC) is a learning technique used to infer the underlying dynamics of a set of sequential data points. For instance, it may learn the dynamics of an input sequence in order to produce a related output sequence, or it may learn the dynamics of given data in order to predict the following time steps. The neural network employed is composed of a single hidden layer together with an input layer and an output layer. As we will see, reservoir computing is a recurrent neural network (RNN) approach, with the main difference that it deterministically sets all the connections between the components of the network except the output connections, which are the only ones to be learnt. This is possible because of the so-called echo states, the key concept behind the reservoir computing approach. Reservoir computing therefore needs to learn far fewer parameters, which makes it computationally cheaper than other RNN approaches. However, this is not the only difference: as will be shown later on, the learning procedure consists of performing a linear regression, which is less costly than the usual backpropagation. The reservoir computing technique has recently gained considerable popularity thanks to the work of chaos theorist Edward Ott and four collaborators at the University of Maryland on chaotic dynamical systems (Pathak et al., 2018b; Pathak et al., 2017). In that work, they were able to predict the dynamics of some chaotic systems up to 8 Lyapunov times, an impressively distant horizon.
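The training scheme the abstract describes (fixed random input and reservoir connections, with only the output connections learnt by linear regression) can be sketched with a minimal echo state network in NumPy. This is an illustrative sketch of the general technique, not the thesis's own code: the reservoir size, spectral radius, ridge parameter, and the sine-wave one-step-ahead prediction task are all assumptions chosen for the example.

```python
# Minimal echo state network (ESN) sketch of the reservoir computing idea:
# all weights are fixed at random except the linear readout, which is fit
# by ridge regression instead of backpropagation.
import numpy as np

rng = np.random.default_rng(0)

n_res = 200             # reservoir size (assumed for this toy example)
spectral_radius = 0.9   # < 1 encourages the echo state property

# Fixed (untrained) input and recurrent reservoir weights
W_in = rng.uniform(-0.5, 0.5, size=n_res)
W = rng.uniform(-0.5, 0.5, size=(n_res, n_res))
W *= spectral_radius / max(abs(np.linalg.eigvals(W)))

def run_reservoir(u):
    """Drive the reservoir with the 1-D input sequence u, collect states."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W_in * u_t + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy task: one-step-ahead prediction of a sine wave
t = np.arange(600)
u = np.sin(0.1 * t)
X = run_reservoir(u[:-1])   # reservoir states for inputs u_0 .. u_{T-1}
y = u[1:]                   # targets: the next input value

washout = 100               # discard the initial transient
X_tr, y_tr = X[washout:], y[washout:]

# Only the readout W_out is learnt, via ridge regression (a linear solve)
ridge = 1e-8
W_out = np.linalg.solve(X_tr.T @ X_tr + ridge * np.eye(n_res), X_tr.T @ y_tr)

pred = X @ W_out
err = np.max(np.abs(pred[washout:] - y[washout:]))
print(err)  # one-step prediction error on this toy task
```

Note that training reduces to a single linear solve over the collected reservoir states, which is the computational advantage over backpropagation that the abstract points out.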
dc.format.extent: 60 p.
dc.format.mimetype: application/pdf
dc.language.iso: eng
dc.rights: cc-by-nc-nd (c) Núria Sánchez Font, 2020
dc.rights.uri: http://creativecommons.org/licenses/by-nc-nd/3.0/es/
dc.source: Màster Oficial - Fonaments de la Ciència de Dades
dc.subject.classification: Aprenentatge automàtic
dc.subject.classification: Xarxes neuronals (Informàtica)
dc.subject.classification: Treballs de fi de màster
dc.subject.classification: Caos (Teoria de sistemes)
dc.subject.other: Machine learning
dc.subject.other: Neural networks (Computer science)
dc.subject.other: Master's theses
dc.subject.other: Chaotic behavior in systems
dc.title: Reservoir computing for learning the underlying dynamics of sequential data points
dc.type: info:eu-repo/semantics/masterThesis
dc.rights.accessRights: info:eu-repo/semantics/openAccess
Appears in Collections:Programari - Treballs de l'alumnat
Màster Oficial - Fonaments de la Ciència de Dades

Files in This Item:
File | Description | Size | Format
codi_font.zip | Source code | 37.73 MB | zip
tfm_sanchez_font_nuria.pdf | Thesis report | 4.26 MB | Adobe PDF


This item is licensed under a Creative Commons License.