I went digging a little more into this stuff about characterizing Shannon entropy with nice sets of axioms, and got into Rényi's On Measures of Entropy and Information. The main lemma right at the beginning there is that the only completely additive function (that is, a function with f(mn) = f(m) + f(n) for all positive integers m, n) that is "eventually stable" (meaning that f(m+1) - f(m) → 0 as m → ∞) is a logarithm in some base.
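As a quick sanity check (my own, not from Rényi's paper), here's a numerical confirmation that the logarithm really does satisfy both hypotheses of the lemma:

```python
import math

# f(n) = log_2(n) is completely additive: f(mn) = f(m) + f(n) for ALL m, n,
# since log(mn) = log(m) + log(n).
def f(n):
    return math.log2(n)

for m in range(1, 50):
    for n in range(1, 50):
        assert abs(f(m * n) - (f(m) + f(n))) < 1e-9

# And it's "eventually stable": f(m+1) - f(m) = log_2(1 + 1/m) -> 0 as m -> inf.
diffs = [f(m + 1) - f(m) for m in (10, 100, 1000, 10000)]
print(diffs)  # shrinking toward 0
```

The lemma says the converse: these two properties together force f to be a constant multiple of log.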

I thought this was kind of neat, so I tried reconstructing/streamlining Rényi's proof. Here's what I arrived at.

Allegedly Erdős proved that this still works with the "completely additive" part weakened to just "additive", where you require f(mn) = f(m) + f(n) only for m and n that are relatively prime. After staring at the problem for a while, I think I have a vague sketch of a proof, but it's very δ-ε fiddly and I'm not sure it actually works.
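To see that Erdős's hypothesis really is weaker, here's a toy example (my own construction) of a function that is additive over coprime arguments but not completely additive: define it on prime powers by f(p^k) = k² log p and extend by the factorization.

```python
import math

def factor(n):
    """Return the prime factorization of n as a dict {p: k}."""
    fac = {}
    d = 2
    while d * d <= n:
        while n % d == 0:
            fac[d] = fac.get(d, 0) + 1
            n //= d
        d += 1
    if n > 1:
        fac[n] = fac.get(n, 0) + 1
    return fac

def f(n):
    # Additive but not completely additive: f(p^k) = k^2 * log(p).
    # If gcd(m, n) = 1, the factorizations of m and n are disjoint, so
    # f(mn) = f(m) + f(n). But f(p^2) = 4*log(p) != 2*f(p) = 2*log(p).
    return sum(k * k * math.log(p) for p, k in factor(n).items())

# Additive on coprime pairs: f(6) = f(2) + f(3)
print(f(6), f(2) + f(3))
# Not completely additive: f(4) != f(2) + f(2)
print(f(4), 2 * f(2))
```

So Erdős's theorem has to rule out all functions like this one using only the eventual-stability condition, which is presumably where the δ-ε fiddliness comes in.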