{"id":24,"date":"2010-04-30T19:08:44","date_gmt":"2010-04-30T17:08:44","guid":{"rendered":"http:\/\/djalil.chafai.net\/blog\/?p=24"},"modified":"2019-11-10T19:03:02","modified_gmt":"2019-11-10T18:03:02","slug":"entropy","status":"publish","type":"post","link":"https:\/\/djalil.chafai.net\/blog\/2010\/04\/30\/entropy\/","title":{"rendered":"Entropies along dynamics and conservation laws"},"content":{"rendered":"<p style=\"text-align: justify;\">The <a title=\"Boltzmann's legacy\" href=\"http:\/\/plato.stanford.edu\/entries\/statphys-Boltzmann\/\">work of Boltzmann<\/a> on entropy in the years 1865-1905 is really amazing. Beyond important combinatorial aspects, one of the general ideas behind his work is that along certain dynamics, some functional is monotonic, and thus, the long time equilibrium, if it exists, is related to the optimum of the functional over the constraints related to the conservation laws. For the original Boltzmann equation $\\partial_tf_t=A(f_t)$ which comes from kinetic gases modelling, the entropy is $H(f)=-\\int\\!f(x)\\log f(x)dx$, and is maximized by Gaussians under a variance constraint. Here the variance constraint corresponds to the convervation law of the energy.  One may call entropies such functionals. Boltzmann was the first to use a partial differential equation to describe the evolution of a probability density function, dozens of years before the rigorous analysis of such concepts in mathematics.<\/p>\n<p style=\"text-align: justify;\">Of course, for nonlinear dynamics, <a title=\"Cercignani's conjecture is sometimes true and always almost true, by Villani\" href=\"http:\/\/www.ams.org\/mathscinet-getitem?mr=1964379\">the initial data may play a subtle role<\/a>. The same idea is present in the notion of gradient flow equations. 
Beyond statistical physics, the <a title=\"maximum entropy principle\" href=\"http:\/\/en.wikipedia.org\/wiki\/Maximum_entropy\">maximum entropy principle<\/a> plays a role in <a title=\"The Bayesian choice, by Robert\" href=\"http:\/\/www.ams.org\/mathscinet-getitem?mr=1835885\">Bayesian statistics<\/a>. It also has something to do with the consistency of the maximum likelihood estimator.<\/p>\n<p style=\"text-align: justify;\">For an ergodic and reversible Markov process, the equilibrium is typically a Gibbs measure, and the free energy plays the role of the entropy and is monotonic. A Gibbs measure maximizes entropy under an average energy constraint involving the potential of the Gibbs measure. The monotonicity does not contradict the reversibility, because reversibility is a property of the equilibrium, and has nothing to do with the initial data.<\/p>\n<p style=\"text-align: justify;\">Another interesting problem involving entropy and monotonicity emerged from information theory and was stated by Shannon: is the Boltzmann entropy monotonic along the standard central limit theorem? And at which speed? Here the dynamics is related to independence and convolution, and the conservation law is the variance. This problem was solved decades later by many authors including <a title=\"Solution of Shannon's problem on the monotonicity of entropy, by Artstein, Ball, Barthe and Naor\" href=\"http:\/\/www.ams.org\/mathscinet-getitem?mr=2083473\">Artstein, Ball, Barthe and Naor<\/a>. 
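The monotonicity in question can be glimpsed numerically. The following rough Monte Carlo sketch assumes NumPy; the uniform starting law, the sample sizes, and the histogram plug-in entropy estimator are choices of this illustration, not of the cited proof:

```python
# Rough Monte Carlo sketch of Shannon's monotonicity: the differential
# entropy of standardized sums of i.i.d. uniform variables increases
# with n toward the Gaussian maximum 0.5*log(2*pi*e).
import numpy as np

rng = np.random.default_rng(0)

def entropy_estimate(samples, bins=200):
    # crude histogram plug-in estimator of the differential entropy
    p, edges = np.histogram(samples, bins=bins, density=True)
    w = np.diff(edges)
    mask = p > 0
    return -np.sum(p[mask] * np.log(p[mask]) * w[mask])

def standardized_sum(n, size=200000):
    # sum of n i.i.d. uniforms on [-sqrt(3), sqrt(3)] (variance 1), rescaled
    x = rng.uniform(-np.sqrt(3.0), np.sqrt(3.0), size=(n, size))
    return x.sum(axis=0) / np.sqrt(n)

ents = [entropy_estimate(standardized_sum(n)) for n in (1, 2, 4, 8)]
gauss = 0.5 * np.log(2.0 * np.pi * np.e)  # entropy of N(0,1), the maximum
print(ents, gauss)
```

The estimated entropies climb from the uniform value toward the Gaussian one; for large n the increments become smaller than the estimator noise, which is consistent with the fast convergence in the cited works.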
The central limit theorem is available in many contexts beyond the classical Abelian case, including the operator algebras of Voiculescu's free probability <a title=\"Shannon's monotonicity problem for free and classical entropy, by Shlyakhtenko\" href=\"http:\/\/www.ams.org\/mathscinet-getitem?mr=2346565\">(Shlyakhtenko solved the problem positively in the free setting)<\/a> and <a title=\"Limit theorems for random walks on Lie groups, by Stroock and  Varadhan\" href=\"http:\/\/www.ams.org\/mathscinet-getitem?mr=517406\">Lie groups<\/a>. It is tempting to formulate the Shannon conjecture on (non-compact) Lie groups with dilations. The answer is unknown, <a title=\"Information-Theoretic Inequalities on Unimodular Lie Groups, by Chirikjian \" href=\"http:\/\/arxiv.org\/abs\/0906.0330\">even for the Heisenberg group<\/a>.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>The work of Boltzmann on entropy in the years 1865-1905 is really amazing. Beyond important combinatorial aspects, one of the general ideas behind his work&#8230;<\/p>\n<div class=\"more-link-wrapper\"><a class=\"more-link\" href=\"https:\/\/djalil.chafai.net\/blog\/2010\/04\/30\/entropy\/\">Continue reading<span class=\"screen-reader-text\">Entropies along dynamics and conservation 
laws<\/span><\/a><\/div>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"iawp_total_views":75},"categories":[1],"tags":[],"_links":{"self":[{"href":"https:\/\/djalil.chafai.net\/blog\/wp-json\/wp\/v2\/posts\/24"}],"collection":[{"href":"https:\/\/djalil.chafai.net\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/djalil.chafai.net\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/djalil.chafai.net\/blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/djalil.chafai.net\/blog\/wp-json\/wp\/v2\/comments?post=24"}],"version-history":[{"count":1,"href":"https:\/\/djalil.chafai.net\/blog\/wp-json\/wp\/v2\/posts\/24\/revisions"}],"predecessor-version":[{"id":11752,"href":"https:\/\/djalil.chafai.net\/blog\/wp-json\/wp\/v2\/posts\/24\/revisions\/11752"}],"wp:attachment":[{"href":"https:\/\/djalil.chafai.net\/blog\/wp-json\/wp\/v2\/media?parent=24"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/djalil.chafai.net\/blog\/wp-json\/wp\/v2\/categories?post=24"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/djalil.chafai.net\/blog\/wp-json\/wp\/v2\/tags?post=24"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}