Output-Constrained Lossy Source Coding With Application to Rate-Distortion-Perception Theory

Li Xie*, Liangyan Li, Jun Chen, Zhongshan Zhang

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

The distortion-rate function of output-constrained lossy source coding with limited common randomness is analyzed for the special case of the squared error distortion measure. An explicit expression is obtained when both the source and reconstruction distributions are Gaussian. This further leads to a partial characterization of the information-theoretic limit of quadratic Gaussian rate-distortion-perception coding, with the perception measure given by either the Kullback-Leibler divergence or the squared quadratic Wasserstein distance. Wagner's result for the perfect realism setting and Zhang et al.'s result for the unlimited common randomness setting are recovered as special cases.
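
The paper's new closed-form expression for the limited-common-randomness regime is not reproduced in this abstract. For orientation only, the sketch below compares the two standard quadratic Gaussian baselines that bracket the problem: the classical (realism-unconstrained) distortion-rate function D(R) = σ²·2^(−2R), and the formula D(R) = 2σ²(1 − √(1 − 2^(−2R))) commonly associated with perfect realism (reconstruction matching the Gaussian source distribution) when sufficient common randomness is available. The function names and the unit-variance source are illustrative assumptions, not the paper's notation.

```python
import numpy as np

def d_classical(rate, var=1.0):
    # Classical quadratic Gaussian distortion-rate function, no realism constraint:
    # D(R) = sigma^2 * 2^(-2R)
    return var * 2.0 ** (-2.0 * rate)

def d_perfect_realism(rate, var=1.0):
    # Quadratic Gaussian distortion-rate function under a perfect-realism constraint
    # (Gaussian reconstruction with the same variance), assuming ample common randomness:
    # D(R) = 2 * sigma^2 * (1 - sqrt(1 - 2^(-2R)))
    return 2.0 * var * (1.0 - np.sqrt(1.0 - 2.0 ** (-2.0 * rate)))

rates = np.linspace(0.1, 4.0, 8)
for r in rates:
    print(f"R={r:.2f}  D_classical={d_classical(r):.4f}  D_perfect_realism={d_perfect_realism(r):.4f}")
```

The gap between the two curves illustrates the distortion penalty incurred by enforcing exact output-distribution matching; the paper's contribution concerns the intermediate regime where the perception constraint is relaxed and the common randomness rate is limited.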

Original language: English
Journal: IEEE Transactions on Communications
DOIs:
Publication status: Accepted/In press - 2024

Keywords

  • Kullback-Leibler divergence
  • Wasserstein distance
  • optimal transport
  • output-constrained source coding
  • rate-distortion-perception theory
  • squared error
