A Note on the Residual Entropy Function

Authors: 
Pietro Muliere, Giovanni Parmigiani, Nicholas G. Polson
Università di Pavia, Duke University, University of Chicago

Nov 30 1991

Interest in the informational content of truncation leads to the study of the residual entropy function, that is, the entropy of a right-truncated random variable viewed as a function of the truncation point. In this note we show that, under mild regularity conditions, the residual entropy function characterizes the probability distribution. We also derive relationships between residual entropy, monotonicity of the failure rate, and stochastic dominance.
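As a hedged sketch of the object described above (the symbols X, f, F, and H below are our own notation, not taken from the paper): for a nonnegative random variable X with density f and distribution function F, the entropy of X right-truncated at t, regarded as a function of t, can be written as

$$
H(t) \;=\; -\int_0^{t} \frac{f(x)}{F(t)} \,\log \frac{f(x)}{F(t)} \, dx, \qquad t > 0,\ F(t) > 0 .
$$

In this notation, the characterization result stated in the abstract says that, under mild regularity conditions, knowledge of H(t) for all t determines F uniquely.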

Information-theoretic measures of distance between distributions are also revisited from a similar perspective. In particular, we study the residual divergence between two positive random variables and investigate some of its monotonicity properties.
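Again as a sketch under our own notational assumptions (the abstract does not fix the definition): for two positive random variables with densities f, g and distribution functions F, G, the residual divergence at t may be taken as the Kullback-Leibler divergence between the two right-truncated distributions,

$$
D(t) \;=\; \int_0^{t} \frac{f(x)}{F(t)} \,\log\!\left( \frac{f(x)/F(t)}{g(x)/G(t)} \right) dx, \qquad t > 0 ,
$$

and the monotonicity properties referred to above concern the behavior of the map t \mapsto D(t).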

The results are relevant to information theory, reliability theory, search problems, and experimental design.

Keywords: 

Information, entropy, failure rate, dominance, Kullback-Leibler divergence

Manuscript: 

1992-23.pdf