Published in

2009 IEEE International Symposium on Information Theory

DOI: 10.1109/isit.2009.5205736

Compression of graphical structures

Proceedings article published in 2009 by Yongwook Choi and Wojciech Szpankowski
This paper was not found in any repository, but could be made available legally by the author.

Full text: Unavailable

Preprint: archiving allowed
Postprint: archiving allowed
Published version: archiving forbidden
Data provided by SHERPA/RoMEO

Abstract

F. Brooks argues that there is “no theory that gives us a metric for information embodied in structure.” Shannon himself alluded to this fifty years earlier in his little-known 1953 paper. Indeed, in the past information theory dealt mostly with “conventional data,” be it textual, image, or video data. However, databases of various sorts have come into existence in recent years for storing “unconventional data,” including biological data, Web data, topographical maps, and medical data. In compressing such data structures, one must consider two types of information: the information conveyed by the structure itself, and the information conveyed by the data labels implanted in the structure. In this paper, we attempt to address the former problem by studying the information content of graphical structures (i.e., unlabeled graphs). In particular, we consider the Erdős–Rényi graphs G(n, p) over n vertices, in which each edge is added independently with probability p. We prove that the structural entropy of G(n, p) is (n choose 2) h(p) - log n! + o(1) = (n choose 2) h(p) - n log n + O(n), where h(p) = -p log p - (1 - p) log(1 - p) is the entropy rate of a conventional memoryless binary source. We then design a two-stage encoding that optimally compresses unlabeled graphs up to the first two leading terms of the structural entropy.
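The leading terms of the structural-entropy formula are easy to evaluate numerically. The sketch below (a minimal illustration, not code from the paper; all function names are ours, and logarithms are base 2 so results are in bits) compares the labeled-graph entropy (n choose 2)·h(p) with the structural estimate (n choose 2)·h(p) - log n!, whose difference, log n!, is the saving from discarding vertex labels.

```python
import math

def binary_entropy(p):
    """h(p) = -p log2 p - (1-p) log2(1-p), the entropy rate of a Bernoulli(p) source."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def labeled_entropy(n, p):
    """Entropy of the labeled graph G(n, p): one h(p) bit-cost per potential edge,
    i.e. (n choose 2) * h(p)."""
    return n * (n - 1) // 2 * binary_entropy(p)

def structural_entropy_estimate(n, p):
    """Leading terms of the structural (unlabeled) entropy of G(n, p):
    (n choose 2) * h(p) - log2(n!)."""
    return labeled_entropy(n, p) - math.log2(math.factorial(n))

# Example: for n = 100 and p = 0.5, dropping labels saves log2(100!) ~ 525 bits
# out of the (100 choose 2) = 4950 bits needed for the labeled graph.
if __name__ == "__main__":
    n, p = 100, 0.5
    print(labeled_entropy(n, p), structural_entropy_estimate(n, p))
```

For dense graphs the saving log2(n!) = n log2 n - O(n) is exactly the second-order term the two-stage encoding of the paper captures.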