Jason (jcreed) wrote,

So everyone knows from kindergarten that the maximum likelihood estimate for fitting a normal distribution with mean μ and standard deviation σ to a sample is that you just set μ to the mean of the sample and σ to the standard deviation of the sample, right? I always kind of took this to be some kind of fuzzy corroborating sign that
(a) the concept of sample mean
(b) the concept of sample standard deviation
(c) the normal distribution
were all kind of comfortably canonical and mathematically inevitable like.
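
(Quick numerical sanity check, just a sketch: minimize the negative log-likelihood with scipy's generic optimizer and compare to the closed forms. One fussy detail: the MLE for σ is the 1/n version of the standard deviation, not the 1/(n-1) "sample" one, hence ddof=0 below.)

    import numpy as np
    from scipy import optimize, stats

    rng = np.random.default_rng(0)
    x = rng.normal(loc=2.0, scale=3.0, size=1000)

    # Negative log-likelihood of a normal with parameters (mu, sigma).
    def nll(params):
        mu, sigma = params
        if sigma <= 0:
            return np.inf
        return -np.sum(stats.norm.logpdf(x, loc=mu, scale=sigma))

    fit = optimize.minimize(nll, x0=[0.0, 1.0], method="Nelder-Mead")
    mu_hat, sigma_hat = fit.x

    print(mu_hat, np.mean(x))            # numerical MLE vs. sample mean
    print(sigma_hat, np.std(x, ddof=0))  # numerical MLE vs. 1/n standard deviation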

I just noticed (not that this is any less an elementary fact) that, in the same way, the MLE of the parameters of the Laplace distribution (1/(2σ))exp(-|x-μ|/σ) — which, you know, although not as completely and utterly ubiquitous as the Gaussian, is still reasonably common — tells you to set μ to the median of the sample, and σ to the "naïve standard deviation", that is, the mean of |xi-μ| over all the xi in your sample. (I call it naïve because, well, I don't know about everyone else, but when I first learned about standard deviations I always wondered why you were supposed to square, average, and then square-root the thing — so this is what you'd get if you didn't square and square-root.) The derivation of why it's the median is somewhat cute: the derivative of |x| is just the sign of x, so maximizing the likelihood means minimizing the sum of |xi-μ|, and at the optimum all the signs have to balance, i.e. half the xi have to lie above μ and half below.
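
(The same sketch works for the Laplace case: again just hand the negative log-likelihood to a generic optimizer rather than doing anything clever, and it recovers the median and the naïve standard deviation.)

    import numpy as np
    from scipy import optimize, stats

    rng = np.random.default_rng(1)
    x = rng.laplace(loc=2.0, scale=3.0, size=1001)  # odd n, so the median is a unique data point

    # Negative log-likelihood of a Laplace distribution with parameters (mu, sigma).
    def nll(params):
        mu, sigma = params
        if sigma <= 0:
            return np.inf
        return -np.sum(stats.laplace.logpdf(x, loc=mu, scale=sigma))

    fit = optimize.minimize(nll, x0=[0.0, 1.0], method="Nelder-Mead")
    mu_hat, sigma_hat = fit.x

    mu_med = np.median(x)
    print(mu_hat, mu_med)                          # numerical MLE vs. sample median
    print(sigma_hat, np.mean(np.abs(x - mu_med)))  # numerical MLE vs. "naive standard deviation"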

I wonder which other statistics you can force to be the max-likelihood parameters of distributions? Can you mix and match them? Can you force more than two at once?
Tags: math, statistics