Abstract: Two correlated sources emit a pair of sequences, each of which is observed by a different encoder. Each encoder produces a rate-limited description of the sequence it observes, and the two descriptions are presented to a guessing device that repeatedly produces sequence pairs until correct. The number of guesses until correct is random, and it is required that it have a moment (of some prespecified order) that tends to one as the length of the sequences tends to infinity. The description rate pairs that allow this are characterized in terms of the Rényi entropy and the Arimoto–Rényi conditional entropy of the joint law of the sources. This solves the guessing analog of the Slepian–Wolf distributed source-coding problem. The achievability is based on random binning, which is analyzed using a technique by Rosenthal.
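For orientation, and by analogy with Arikan's single-source guessing result and the shape of the Slepian–Wolf region, the characterization can be sketched as follows; the specific order 1/(1+ρ) and the exact entropy definitions below are illustrative assumptions and should be checked against the paper's theorem:

\[
R_X \;>\; H_{\frac{1}{1+\rho}}(X \mid Y), \qquad
R_Y \;>\; H_{\frac{1}{1+\rho}}(Y \mid X), \qquad
R_X + R_Y \;>\; H_{\frac{1}{1+\rho}}(X, Y),
\]

where ρ > 0 is the prespecified order of the guessing moment, H_α(X, Y) is the Rényi entropy of order α of the joint law, and H_α(X | Y) is the Arimoto–Rényi conditional entropy of order α.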