
Journal article

Markov categories and entropy

Abstract:
Markov categories are a novel framework to describe and treat problems in probability and information theory. In this work we combine the categorical formalism with the traditional quantitative notions of entropy, mutual information, and data processing inequalities. We show that several quantitative aspects of information theory can be captured by an enriched version of Markov categories, where the spaces of morphisms are equipped with a divergence or even a metric. Following standard practices of information theory, we get measures of mutual information by quantifying, with a chosen divergence, how far a joint source is from displaying independence of its components. More strikingly, Markov categories give a notion of determinism for sources and channels, and we can define entropy exactly by quantifying how far a source or channel is from being deterministic. This recovers Shannon and Rényi entropies, as well as the Gini-Simpson index used in ecology to quantify diversity, and it can be used to give a conceptual definition of generalized entropy. No previous knowledge of category theory is assumed.
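To make the abstract's constructions concrete, the following is a minimal numerical sketch on finite distributions. It is not taken from the paper; the function names and the NumPy formulation are illustrative assumptions. The key fact it demonstrates is that a source p is deterministic exactly when copying it agrees with drawing two independent samples, so one can measure entropy as the divergence between the "copied" joint law and the independent product, and mutual information as the divergence of a joint source from the product of its marginals.

# Illustrative sketch (not the paper's code or notation): entropy as
# divergence from determinism, mutual information as divergence from
# independence, on finite probability vectors.
import numpy as np

def kl(P, Q):
    """Kullback-Leibler divergence D(P || Q) in nats, finite support."""
    mask = P > 0
    return float(np.sum(P[mask] * np.log(P[mask] / Q[mask])))

def tv(P, Q):
    """Total variation distance."""
    return 0.5 * float(np.abs(P - Q).sum())

def renyi_div(P, Q, alpha):
    """Renyi divergence of order alpha (alpha != 1)."""
    mask = P > 0
    return float(np.log(np.sum(P[mask]**alpha * Q[mask]**(1 - alpha)))
                 / (alpha - 1))

p = np.array([0.5, 0.3, 0.2])

# A source p is deterministic iff copying it equals taking two
# independent samples. "copy" puts mass p_i on the diagonal (i, i).
diag = np.diag(p)        # joint law of (X, X) with X ~ p
prod = np.outer(p, p)    # joint law of two independent copies of X

# KL divergence of the copied source from the product is exactly the
# Shannon entropy H(p) = -sum_i p_i log p_i:
shannon = -np.sum(p * np.log(p))
assert np.isclose(kl(diag.ravel(), prod.ravel()), shannon)

# Total variation distance gives the Gini-Simpson diversity index:
assert np.isclose(tv(diag.ravel(), prod.ravel()), 1 - np.sum(p**2))

# Renyi divergence of order alpha gives Renyi entropy of order 2 - alpha
# (a short calculation: sum_i p_i^alpha (p_i^2)^(1-alpha) = sum_i p_i^(2-alpha)):
alpha = 0.5
renyi_entropy = np.log(np.sum(p**(2 - alpha))) / (1 - (2 - alpha))
assert np.isclose(renyi_div(diag.ravel(), prod.ravel(), alpha), renyi_entropy)

# Mutual information: divergence of a joint source from the product of
# its marginals, i.e. how far it is from independence.
pxy = np.array([[0.3, 0.1],
                [0.2, 0.4]])
px, py = pxy.sum(axis=1), pxy.sum(axis=0)
print("I(X;Y) =", kl(pxy.ravel(), np.outer(px, py).ravel()), "nats")

Under these assumptions, the three quantities differ only in which divergence is plugged into the same comparison of "copy" against "independent product", which is the pattern the abstract describes.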
Publication status:
Published
Peer review status:
Peer reviewed

Publisher copy:
10.1109/tit.2023.3328825

Authors


Institution:
University of Oxford
Division:
MPLS (Mathematical, Physical and Life Sciences)
Department:
Computer Science
Role:
Author


Publisher:
IEEE
Journal:
IEEE Transactions on Information Theory
Volume:
70
Issue:
3
Pages:
1671-1692
Publication date:
2023-10-31
Acceptance date:
2023-10-01
DOI:
10.1109/tit.2023.3328825
EISSN:
1557-9654
ISSN:
0018-9448


Language:
English
Pubs id:
1701698
Local pid:
pubs:1701698
Deposit date:
2024-05-09
