On conditional Ingleton inequalities

Milan Studený (Institute of Information Theory and Automation, Prague)

01-Mar-2023, 16:00-17:15

Abstract: Linear information inequalities valid for entropy functions induced by discrete random variables play an important role in the task of characterizing discrete conditional independence structures. Specifically, the so-called conditional Ingleton inequalities in the case of four random variables are in the center of interest: these are valid under conditional independence assumptions on the inducing random variables. Four inequalities of this form were revealed earlier: by Yeung and Zhang (1997), by Matúš (1999), and by Kaced and Romashchenko (2013). In our recent paper (2021) a fifth inequality of this type was found. These five information inequalities can be used to characterize all conditional independence structures induced by four discrete random variables. One of the open problems in that 2021 paper was whether the list of conditional Ingleton inequalities over four random variables is complete; the analysis can be completed by a recent finding of Boege (2022), which answers that question.
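For orientation, a hedged sketch (not a quotation from the cited papers): one common way to write the Ingleton expression for four discrete random variables A, B, C, D uses (conditional) mutual information, and a conditional Ingleton inequality asserts its nonnegativity for entropy functions whose inducing variables satisfy an additional conditional independence constraint. The particular constraint shown below is an illustrative placeholder, not one of the five specific inequalities discussed in the talk.

% Ingleton expression in mutual-information form (one common convention):
\[
  \Box(A,B;C,D) \;=\; I(A;B\mid C) + I(A;B\mid D) + I(C;D) - I(A;B) .
\]
% The unconditional inequality \(\Box(A,B;C,D)\ge 0\) holds for ranks of
% linear subspaces but fails for general entropy functions of discrete
% random variables. A conditional Ingleton inequality has the schematic form
\[
  I(X;Y\mid Z) = 0 \ \text{ for some fixed } X,Y,Z \in \{A,B,C,D\}
  \quad\Longrightarrow\quad \Box(A,B;C,D) \;\ge\; 0 ,
\]
% where the conditional independence constraint on the left is illustrative only.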

Computer science, Mathematics

Audience: researchers in the discipline

(paper)


Seminar on Algorithmic Aspects of Information Theory

Series comments: This online seminar is a follow-up of Dagstuhl Seminar 22301, www.dagstuhl.de/en/program/calendar/semhp/?semnr=22301.

Organizer: Andrei Romashchenko*
*contact for this listing
