Albert Atserias, Anuj Dawar, et al.
Journal of the ACM
Constraints on entropies are considered to be the laws of information theory. Even though the pursuit of their discovery has been a central theme of research in information theory, the algorithmic aspects of constraints on entropies remain largely unexplored. Here, we initiate an investigation of decision problems about constraints on entropies by placing several different such problems into levels of the arithmetical hierarchy. We establish the following results on checking the validity over all almost-entropic functions: first, validity of a Boolean information constraint arising from a monotone Boolean formula is co-recursively enumerable; second, validity of “tight” conditional information constraints is in Π⁰₃. Furthermore, under some restrictions, validity of conditional information constraints “with slack” is in Σ⁰₂, and validity of information inequality constraints involving max is Turing equivalent to validity of information inequality constraints (with no max involved). We also prove that the classical implication problem for conditional independence statements is co-recursively enumerable.
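As background for the abstract above, the LaTeX sketch below illustrates what constraints on entropies look like; the inequalities shown (monotonicity and submodularity of the entropy function, and a conditional independence statement written as a vanishing conditional mutual information) are standard textbook examples, not formulas taken from the paper itself.

% Standard examples of entropic constraints (illustrative only).
% h denotes the entropy function on subsets of jointly distributed
% random variables X, Y, Z.
\begin{align*}
  % Monotonicity: adding variables never decreases entropy.
  h(X) &\le h(X, Y) \\
  % Submodularity, equivalently nonnegativity of the conditional
  % mutual information I(X; Y \mid Z).
  h(X, Z) + h(Y, Z) &\ge h(X, Y, Z) + h(Z) \\
  % A conditional independence statement X \perp Y \mid Z corresponds
  % to the "tight" constraint that this quantity vanishes:
  I(X; Y \mid Z) &= 0
\end{align*}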
Jing Ao, Zehui Cheng, et al.
JDIQ
Balder ten Cate, Victor Dalmau, et al.
ICDT 2024
Benny Kimelfeld, Phokion G. Kolaitis
Communications of the ACM