Bayesian Representation of Stochastic Processes under Learning: de Finetti Revisited
Matthew O. Jackson
Ehud Kalai
Rann Smorodinsky
Abstract
A probability distribution governing the evolution of a stochastic process has infinitely many Bayesian representations of the form µ = ∫ µ_θ dλ(θ). Among these, a natural representation is one whose components (the µ_θ's) are learnable (one can approximate µ_θ by conditioning µ on observation of the process) and sufficient for prediction (µ_θ's predictions are not aided by conditioning on observation of the process). We show the existence and uniqueness of such a representation under a suitable asymptotic mixing condition on the process. This representation can be obtained by conditioning on the tail-field of the process, and any learnable representation that is sufficient for prediction is asymptotically like the tail-field representation. This result is related to the celebrated de Finetti theorem, but with exchangeability weakened to an asymptotic mixing condition, and with his conclusion of a decomposition into i.i.d. component distributions weakened to components that are learnable and sufficient for prediction.
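
For reference, de Finetti's theorem covers the exchangeable special case, in which the components µ_θ can be taken to be i.i.d. distributions. A standard statement for an exchangeable {0,1}-valued process, written here in illustrative LaTeX notation that is not taken from the paper, is:

\[
  \mu(X_1 = x_1, \ldots, X_n = x_n)
    = \int_0^1 \theta^{\sum_{i=1}^n x_i} (1-\theta)^{\,n - \sum_{i=1}^n x_i} \, d\lambda(\theta),
  \qquad x_i \in \{0,1\},
\]

so each component µ_θ is the law of an i.i.d. Bernoulli(θ) sequence and λ is the mixing measure; the result summarized above weakens both the exchangeability hypothesis and the i.i.d. form of the components.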