Please use this identifier to cite or link to this item: http://hdl.handle.net/2445/173736
Full metadata record
DC Field: Value
dc.contributor.author: Corral, Álvaro
dc.contributor.author: García del Muro y Solans, Montserrat
dc.date.accessioned: 2021-02-08T13:40:41Z
dc.date.available: 2021-02-08T13:40:41Z
dc.date.issued: 2020-02-05
dc.identifier.issn: 1099-4300
dc.identifier.uri: http://hdl.handle.net/2445/173736
dc.description.abstract: The word-frequency distribution provides the fundamental building blocks that generate discourse in natural language. It is well known, from empirical evidence, that the word-frequency distribution of almost any text is described by Zipf's law, at least approximately. Following Stephens and Bialek (2010), we interpret the frequency of any word as arising from the interaction potentials between its constituent letters. Indeed, Jaynes' maximum-entropy principle, with the constraints given by every empirical two-letter marginal distribution, leads to a Boltzmann distribution for word probabilities, with an energy-like function given by the sum of the all-to-all pairwise (two-letter) potentials. The so-called improved iterative-scaling algorithm allows us to find the potentials from the empirical two-letter marginals. We considerably extend Stephens and Bialek's results, applying this formalism to words of up to six letters in length from the English subset of the recently created Standardized Project Gutenberg Corpus. We find that the model is able to reproduce Zipf's law, but with some limitations: the general Zipf's power-law regime is obtained, but the probabilities of individual words show considerable scatter. In this way, a pure statistical-physics framework is used to describe the probabilities of words. As a by-product, we find that both the empirical two-letter marginal distributions and the interaction-potential distributions follow well-defined statistical laws. (See the sketch following this record.)
dc.format.extent: 19 p.
dc.format.mimetype: application/pdf
dc.language.iso: eng
dc.publisher: MDPI
dc.relation.isformatof: Reproduction of the document published at: https://doi.org/10.3390/e22020179
dc.relation.ispartof: Entropy, 2020, vol. 22(2), num. 179
dc.relation.uri: https://doi.org/10.3390/e22020179
dc.rights: cc-by (c) Corral, Álvaro et al., 2020
dc.rights.uri: http://creativecommons.org/licenses/by/3.0/es
dc.source: Articles publicats en revistes (Física de la Matèria Condensada)
dc.subject.classification: Entropia
dc.subject.classification: Lingüística matemàtica
dc.subject.classification: Distribució (Teoria de la probabilitat)
dc.subject.classification: Probabilitats
dc.subject.other: Entropy
dc.subject.other: Mathematical linguistics
dc.subject.other: Distribution (Probability theory)
dc.subject.other: Probabilities
dc.title: From Boltzmann to Zipf through Shannon and Jaynes
dc.type: info:eu-repo/semantics/article
dc.type: info:eu-repo/semantics/publishedVersion
dc.identifier.idgrec: 700396
dc.date.updated: 2021-02-08T13:40:41Z
dc.rights.accessRights: info:eu-repo/semantics/openAccess
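
Sketch of the maximum-entropy construction described in the abstract above. The notation below (letters a_i, pairwise potentials V_ij, partition function Z_L) is illustrative and not taken from the paper itself; it simply restates, in compilable LaTeX, the Jaynes/Boltzmann form that the abstract summarizes.

\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Hedged sketch of the maximum-entropy argument in the abstract.
% Symbols (a_i, V_{ij}, Z_L) are illustrative, not taken from the paper.
For a word $w = a_1 a_2 \dots a_L$ of length $L$, maximize the Shannon entropy
$S = -\sum_{w} P(w) \ln P(w)$ subject to the empirical two-letter marginals
$P_{ij}(a,b)$ for every pair of letter positions $(i,j)$. Jaynes' principle
then yields the Boltzmann form
\begin{equation}
  P(w) = \frac{1}{Z_L}\, e^{-E(w)},
  \qquad
  E(w) = \sum_{i<j} V_{ij}(a_i, a_j),
\end{equation}
where the pairwise potentials $V_{ij}$ (the Lagrange multipliers of the
constraints) are fitted, e.g.\ by improved iterative scaling, so that the
model marginals match the empirical ones, and $Z_L$ normalizes over all
letter strings of length $L$.
\end{document}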
Appears in Collections: Articles publicats en revistes (Física de la Matèria Condensada)

Files in This Item:
File: 700396.pdf    Size: 1.17 MB    Format: Adobe PDF


This item is licensed under a Creative Commons License.