Information Inequalities for Five Random Variables.
Laszlo Csirmaz
Abstract: Split the base set N into the disjoint union YXZ, and let Y1,...,Yn be copies of Y and Z1,...,Zm be copies of Z. For any probability distribution on YXZ there is another probability distribution on (Y1,...,Yn, X, Z1,...,Zm) such that the marginals on XYi and on XY are the same, the marginals on XZj and on XZ are the same, and moreover the variable sets {Yi, Zj} are completely conditionally independent over X.
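This construction can be sketched numerically. The following is a minimal illustration, not taken from the talk, assuming the standard conditional-product definition q = p(x)·∏i p(yi|x)·∏j p(zj|x), which has the stated marginal and conditional-independence properties; all function and variable names are hypothetical.

```python
# Sketch of the copy construction described in the abstract, under the
# assumption that it is the usual conditional-product distribution:
#   q(y_1..y_n, x, z_1..z_m) = p(x) * prod_i p(y_i|x) * prod_j p(z_j|x).
import itertools
import random

def copy_construction(p, n, m):
    """p maps (y, x, z) -> probability; returns q on (y_1..y_n, x, z_1..z_m)."""
    xs = {x for (_, x, _) in p}
    px = {x: sum(v for (y, xx, z), v in p.items() if xx == x) for x in xs}
    py_x = {x: {} for x in xs}   # conditional p(y | x)
    pz_x = {x: {} for x in xs}   # conditional p(z | x)
    for (y, x, z), v in p.items():
        py_x[x][y] = py_x[x].get(y, 0.0) + v / px[x]
        pz_x[x][z] = pz_x[x].get(z, 0.0) + v / px[x]
    q = {}
    for x in xs:
        for ys in itertools.product(py_x[x], repeat=n):
            for zs in itertools.product(pz_x[x], repeat=m):
                w = px[x]
                for y in ys:
                    w *= py_x[x][y]
                for z in zs:
                    w *= pz_x[x][z]
                q[ys + (x,) + zs] = w
    return q

# Check on a random distribution that the (Y1, X) marginal of q equals
# the (Y, X) marginal of p, as the abstract states.
outcomes = [(y, x, z) for y in "ab" for x in "01" for z in "uv"]
weights = [random.random() for _ in outcomes]
p = {o: w / sum(weights) for o, w in zip(outcomes, weights)}
q = copy_construction(p, n=2, m=2)

marg_p, marg_q = {}, {}
for (y, x, z), v in p.items():
    marg_p[(y, x)] = marg_p.get((y, x), 0.0) + v
for (y1, y2, x, z1, z2), v in q.items():
    marg_q[(y1, x)] = marg_q.get((y1, x), 0.0) + v
assert all(abs(marg_p[k] - marg_q[k]) < 1e-9 for k in marg_p)
```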
This property is applied with Y={cd}, X={ab}, and Z={z}, and all consequences are computed for n<10. Based on the results, two infinite families of five-variable non-Shannon inequalities are defined and proved to be consequences of the above property. We investigate the “extremal” inequalities among them, their asymptotic behavior, and how they delimit the five-variable entropy region. At the end we discuss how they relate to Matus’ inequalities.
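For orientation only (not quoted from the talk): in entropy terms the copy construction guarantees the equality constraints below for the extended variables, where H denotes Shannon entropy; combining them with the Shannon inequalities for the extended set and projecting back to the original variables is the usual route to such non-Shannon consequences.

```latex
% Entropy constraints guaranteed by the copy construction (H = Shannon entropy).
H(XY_i) = H(XY), \qquad H(XZ_j) = H(XZ), \qquad
H(Y_1 \cdots Y_n\, Z_1 \cdots Z_m \mid X)
  \;=\; \sum_{i=1}^{n} H(Y_i \mid X) \;+\; \sum_{j=1}^{m} H(Z_j \mid X).
```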
Topics: Computer science, Mathematics
Audience: researchers in the discipline
Seminar on Algorithmic Aspects of Information Theory
Series comments: This online seminar is a follow-up to the Dagstuhl Seminar 22301, www.dagstuhl.de/en/program/calendar/semhp/?semnr=22301.
Organizer: Andrei Romashchenko (contact for this listing)
