# Bayesian Entropy and Inference



| Scientific Paper | |
|---|---|
| Title | Bayesian Entropy and Inference |
| Author(s) | Thomas E Phipps, Michael H Brill |
| Keywords | Bayesian Entropy, Inference |
| Published | 1995 |
| Journal | Physics Essays |
| Volume | 8 |
| Number | 4 |
| Pages | 615-625 |

## Abstract

The traditional definition of entropy employed in statistical mechanics and in Shannon's information theory, −Σₙ pₙ ln(pₙ), may be viewed as a noninvariant special case (associated with an implicit uniform prior) of an invariant covering theory based on −Σₙ pₙ ln(pₙ/p), where pₙ refers to a posterior probability distribution, as affected by the arrival of "new data," and p refers to a Bayesian prior probability distribution. This generalized or explicitly Bayesian form of "entropy" thus quantifies the transition between two states of knowledge, prior and posterior, exactly as does Bayes' theorem, and may be considered to have the same scope and information content as that theorem. Constrained extremalization of this form of entropy is demonstrated to be useful in solving three types of classical probability problems: (1) those for which the availability or presumption of single-parameter information allows a Poisson distribution to serve as "universal prior," (2) those for which additional prior information justifies a known departure from the Poisson law, and (3) those for which statistical sampling provides arbitrary nonuniform prior information; that is, prior to additional data input. In all cases the "new data" must be of the aggregated or summed type expressible as Lagrange constraints. By reference to an example taken from Deming and by extension of the proof of Shannon's "composition law" (hitherto thought to be valid only for the traditional form of entropy), it is shown that use of Bayesian entropy can broaden the scope of information theory, with interpretation of "information" as that which quantifies a transition between states of knowledge. Shannon's "monotonicity law" becomes superfluous and can be eliminated. This generalized form of entropy also promises a more powerful means of treating nonequilibrium thermodynamics, by freeing statistical-mechanical entropy from implicit connection to the equilibrium (uniform prior) or thermostatic state.
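The constrained extremalization the abstract describes can be illustrated numerically: minimize the Bayesian relative entropy Σₙ pₙ ln(pₙ/qₙ) against a prior q, subject to "new data" of the aggregated type, here a single mean constraint Σₙ n·pₙ = m. The Lagrange-multiplier solution is an exponential tilting of the prior, pₙ ∝ qₙ·exp(λn), with λ chosen to satisfy the constraint. The sketch below is illustrative only (the function name `tilt_prior` and the truncation at 60 terms are assumptions, not taken from the paper) and uses a truncated Poisson prior in the "universal prior" role of case (1).

```python
import math

def tilt_prior(prior, target_mean, tol=1e-12):
    """Minimize sum_n p_n * ln(p_n / prior_n) subject to sum_n n*p_n = target_mean.

    The Lagrange solution is p_n proportional to prior_n * exp(lam * n);
    the multiplier lam is found by bisection on the resulting mean.
    """
    def mean_for(lam):
        # Tilted (unnormalized) weights and their mean.
        w = [q * math.exp(lam * n) for n, q in enumerate(prior)]
        z = sum(w)
        return sum(n * x for n, x in enumerate(w)) / z

    lo, hi = -5.0, 5.0  # bracket for lam; the tilted mean increases with lam
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [q * math.exp(lam * n) for n, q in enumerate(prior)]
    z = sum(w)
    return [x / z for x in w]

# Poisson prior with mean 4, truncated at 60 terms (tail mass is negligible).
mu, N = 4.0, 60
prior = [math.exp(-mu) * mu**n / math.factorial(n) for n in range(N)]

# "New data" as a Lagrange constraint: the posterior mean must equal 6.
post = tilt_prior(prior, target_mean=6.0)
```

With a Poisson prior, exponential tilting yields another Poisson distribution (here with mean 6), since exp(−μ)·μⁿ·exp(λn)/n! ∝ exp(−μeᴧ)·(μeᴧ)ⁿ/n!; this closure under mean constraints is one reason the Poisson law can act as a "universal prior."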