Tight upper bounds on the number of homomorphic images of a hypergraph in another.

Dan Suciu (University of Washington)

Wed Jun 5, 15:00-16:15

Abstract: Given two hypergraphs $H$ and $G$, we are interested in the number of homomorphisms $H \to G$. The hyperedges are typed, and homomorphisms must preserve the type. The hypergraph $G$ is unknown; instead, we only know statistics about $G$, such as the total number of hyperedges of each type, or information about their degree sequences. The problem is to compute an upper bound on the number of homomorphisms $H \to G$ over all $G$ satisfying the given statistics. We say that the bound is tight within a constant $c \le 1$ if there exists a hypergraph $G$ satisfying the given statistics for which the number of homomorphisms is at least $c$ times the bound.
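
In symbols (notation introduced here for readability, not taken from the abstract): writing $\mathrm{hom}(H, G)$ for the number of homomorphisms and $B$ for the bound computed from the statistics, the bound is tight within $c$ when
$$\max\{\,\mathrm{hom}(H, G) \;:\; G \text{ satisfies the given statistics}\,\} \;\ge\; c \cdot B.$$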

We start with the AGM bound (due to Atserias, Grohe, and Marx), which strengthens a prior result by Friedgut and Kahn. This bound uses only the cardinality of each type of hyperedge of $G$. We prove the AGM bound using a numerical inequality due to Friedgut, which generalizes the Cauchy-Schwarz and Hölder inequalities. The AGM bound can be computed in time polynomial in the size of $H$, and is tight within a constant $1/2^k$, where $k$ is the number of vertices in $H$.
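
As a reminder, in a standard formulation stated here for context (not quoted from the talk): if $(w_e)_{e \in E(H)}$ is a fractional edge cover of $H$, i.e. $\sum_{e \ni v} w_e \ge 1$ for every vertex $v$ of $H$, and $N_e$ denotes the number of hyperedges of $G$ of the type of $e$, then
$$\mathrm{hom}(H, G) \;\le\; \prod_{e \in E(H)} N_e^{\,w_e},$$
and the AGM bound is obtained by minimizing the right-hand side over all fractional edge covers, which is a linear program of size polynomial in $H$.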

Next, we prove a general bound that uses both the cardinality of each type of hyperedge and the $\ell_p$ norms of their degree sequences; in particular, the $\ell_\infty$ norm is the largest degree of that type of hyperedge in $G$. The proof uses information inequalities, and for that reason we call the bound the "entropic bound". However, it is an open problem whether the entropic bound is computable.
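
To make these statistics concrete (an illustrative formulation, with notation introduced here rather than taken from the abstract): for a hyperedge type viewed as a relation $R$ of $G$ and a set $X$ of its positions, the degree of a tuple $t$ is $\deg_{R,X}(t) = |\{\text{hyperedges of } R \text{ whose projection on } X \text{ equals } t\}|$, and the statistic is the $\ell_p$ norm of the degree sequence,
$$\|\deg_{R,X}\|_p \;=\; \Big(\sum_t \deg_{R,X}(t)^p\Big)^{1/p}.$$
Here $p = \infty$ gives the largest degree, and $p = 1$ gives the cardinality of $R$, so the cardinality statistics of the AGM bound correspond to $p = 1$.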

Finally, we restrict the statistics to "simple" statistics, where every degree is that of a single node, as opposed to a tuple of nodes; in particular, if $G$ is a graph, then all statistics are simple. In this case we show that the entropic vector in the entropic bound can be restricted to one with a non-negative $I$-measure. Furthermore, the entropic bound is computable in time polynomial in the size of $H$, and it is tight within a constant $1/2^{2^k-1}$.
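
For illustration (an example added here, not part of the abstract): if $G$ is an ordinary graph with edge relation $R(x, y)$, the simple statistics are the norms of the degree sequences $\deg(x) = |\{y : R(x, y)\}|$ and $\deg(y) = |\{x : R(x, y)\}|$, each conditioned on a single node; a degree conditioned on a pair of nodes, such as the number of $z$'s per pair $(x, y)$ in a ternary relation $R(x, y, z)$, would not be simple.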

Computer science, Mathematics

Audience: researchers in the discipline


Seminar on Algorithmic Aspects of Information Theory

Series comments: This online seminar is a follow-up of the Dagstuhl Seminar 22301, www.dagstuhl.de/en/program/calendar/semhp/?semnr=22301.

Organizer: Andrei Romashchenko*
*contact for this listing
