## Wednesday, December 21, 2011

### Information and Entropy Econometrics

The eminent physicist E. T. Jaynes (1957a) wrote:
"Information theory provides a constructive criterion for setting up probability distributions on the basis of partial knowledge, and leads to a type of statistical inference which is called the maximum entropy estimate. It is the least biased estimate possible on the given information; i.e., it is maximally noncommittal with regard to missing information."
In other words, when we want to describe noisy data with a statistical model, we should choose, among all distributions consistent with what we actually know (for instance, observed moments), the one with maximum entropy: it assumes nothing beyond the given information.
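As a concrete illustration (my own sketch, not from Jaynes's paper), consider the classic "Brandeis dice" problem: among all distributions over die faces 1..6 with a prescribed mean (here 4.5 instead of the fair 3.5), the maximum entropy distribution takes the exponential form p_i ∝ exp(λ·i), and the Lagrange multiplier λ can be found numerically from the mean constraint:

```python
import numpy as np

# Maximum-entropy distribution over die faces 1..6 subject to a mean
# constraint E[X] = 4.5. The maxent solution is p_i ∝ exp(lam * i);
# we solve for lam by bisection, since the implied mean is increasing in lam.

faces = np.arange(1, 7)
target_mean = 4.5  # assumed constraint for illustration

def mean_for(lam):
    """Mean of the exponential-family distribution with multiplier lam."""
    w = np.exp(lam * faces)
    p = w / w.sum()
    return p @ faces

# Bisection on lam over a bracket wide enough to contain the solution.
lo, hi = -10.0, 10.0
for _ in range(100):
    mid = (lo + hi) / 2
    if mean_for(mid) < target_mean:
        lo = mid
    else:
        hi = mid

lam = (lo + hi) / 2
p = np.exp(lam * faces)
p /= p.sum()  # the maximum-entropy distribution satisfying the constraint
```

With a mean above 3.5, the solution tilts probability mass toward the high faces while staying as "spread out" (high-entropy) as the constraint allows; setting the target mean to 3.5 recovers the uniform distribution.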