Jason (jcreed) wrote,

I went digging a little more into this stuff about characterizing Shannon entropy with nice sets of axioms, and got into Rényi's On Measures of Entropy and Information. The main lemma right at the beginning there is that the only completely additive function (that is, a function with f(mn) = f(m) + f(n) for all positive integers m, n) that is "eventually stable" (meaning that the difference f(m+1) - f(m) tends to zero as m grows) is a constant multiple of the logarithm, i.e. a logarithm in some base.
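To get a feel for why both hypotheses are needed, here's a quick numerical sketch (my own illustration, not from Rényi's paper). The function Ω(n), the number of prime factors of n counted with multiplicity, is completely additive just like log, but it fails the stability condition: its consecutive differences keep jumping around rather than tending to zero.

```python
import math

def omega(n):
    """Omega(n): number of prime factors of n counted with multiplicity."""
    count = 0
    d = 2
    while d * d <= n:
        while n % d == 0:
            count += 1
            n //= d
        d += 1
    if n > 1:
        count += 1  # leftover prime factor
    return count

# Both log and Omega are completely additive: f(mn) = f(m) + f(n) for ALL m, n.
for m, n in [(6, 35), (4, 9), (12, 25)]:
    assert math.isclose(math.log(m * n), math.log(m) + math.log(n))
    assert omega(m * n) == omega(m) + omega(n)

# But only log is "eventually stable": log(m+1) - log(m) -> 0,
# while Omega(m+1) - Omega(m) oscillates wildly, e.g. across primes
# and powers of 2.
print([round(math.log(m + 1) - math.log(m), 5) for m in (10, 100, 1000)])
print([omega(m + 1) - omega(m) for m in (1023, 1024, 8191, 8192)])
```

So the lemma is really saying that the stability condition is what singles out log from the enormous family of completely additive functions.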

I thought this was kind of neat, so I tried reconstructing/streamlining Rényi's proof. Here's what I arrived at.

Allegedly Erdős proved that this still works with the "completely additive" hypothesis weakened to just "additive", where you require f(mn) = f(m) + f(n) only for m and n that are relatively prime. After staring at the problem for a while, I think I have a vague sketch of a proof, but it's very δ-ε fiddly and I'm not sure it actually works.
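One way to see why the weakened hypothesis makes the problem so much harder (again my own illustration): an additive function is determined by completely arbitrary values on prime powers, and extending by additivity over the coprime prime-power components of n automatically satisfies the coprime condition. So the stability condition has to do all the work of forcing those free parameters to line up with a logarithm. Here the choice g(p^k) = (log p^k)^2 is just one arbitrary perturbation:

```python
import math

def prime_power_components(n):
    """Return the prime-power components p**k of n (pairwise coprime)."""
    parts = []
    d = 2
    while d * d <= n:
        if n % d == 0:
            pk = 1
            while n % d == 0:
                pk *= d
                n //= d
            parts.append(pk)
        d += 1
    if n > 1:
        parts.append(n)
    return parts

# Assign arbitrary values on prime powers -- here (log p^k)^2 -- and extend
# by summing over the coprime components. Any such g is additive.
def g(n):
    return sum(math.log(pk) ** 2 for pk in prime_power_components(n))

# Additivity holds for coprime m, n ...
assert math.isclose(g(8 * 9), g(8) + g(9))
assert math.isclose(g(5 * 12), g(5) + g(12))
# ... but g is not completely additive: 2*2 shares the factor 2,
# and g(4) = (2 log 2)^2 != 2 (log 2)^2 = g(2) + g(2).
assert not math.isclose(g(2 * 2), g(2) + g(2))
```

Nothing pins g down on the prime powers themselves, which is presumably where all the δ-ε fiddliness comes from.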
Tags: math
