The thesis develops a cohomological interpretation of entropy and related quantities, such as multinomial coefficients. It shows that information has a topological meaning and that topology serves as a unifying framework. I developed a combinatorial analogue of the fundamental equation of information theory, which yields a new characterization of generalized multinomial coefficients, and introduced a cohomological characterization of the differential entropy of Gaussian variables, generalizing the axiomatic approaches of Shannon, Khinchin, and others.
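For context, the classical fundamental equation of information theory (a standard formulation, quoted here for orientation; the thesis's combinatorial analogue is not reproduced) reads:

```latex
% Fundamental equation of information theory (classical form):
% for 0 <= x, y < 1 with x + y <= 1,
f(x) + (1 - x)\, f\!\left(\frac{y}{1 - x}\right)
  = f(y) + (1 - y)\, f\!\left(\frac{x}{1 - y}\right),
% whose suitably regular symmetric solutions are scalar multiples
% of the binary Shannon entropy:
f(x) = -x \log x - (1 - x)\log(1 - x).
```

This functional equation encodes the chain rule of entropy for two events; the combinatorial analogue in the work below plays the same role for generalized multinomial coefficients.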
J.P. Vigneaux, “Information structures and their cohomology,” Theory and Applications of Categories, Vol. 35, No. 38, 2020, pp. 1476–1529.
D. Bennequin and J.P. Vigneaux, “A functional equation related to generalized entropies and the modular group,” Aequationes Mathematicae (2020). https://doi.org/10.1007/s00010-020-00717-2
J.P. Vigneaux, “A homological characterization of generalized multinomial coefficients related to the entropic chain rule,” arXiv:2003.02021.
Slides used at my PhD defense, June 2019.
Presentation at OASIS Seminar: “Information cohomology: an overview.” Oxford, England, 2019.
Presentation at LAWCI 2018: “Information theory for Tsallis 2-entropy.” Latin American Week on Coding and Information, Campinas, Brazil, 2018.
Poster at Entropy 2018: “A combinatorial interpretation for Tsallis 2-entropy.” Barcelona, Spain, 2018.
Presentation at AAT 2017: “Information topology and probabilistic graphical models.” Conference on Applied Algebraic Topology, Sapporo, Japan, 2017.
Presentation at IHES: “Variations on information theory: categories, cohomology, entropy.” Slides for my talk at the conference Les Probabilités de Demain, 2016.