(a) the concept of sample mean

(b) the concept of sample standard deviation

(c) the normal distribution

were all kind of comfortably canonical and mathematically inevitable.

I just noticed (not that this is any less an elementary fact) that, in the same way, the MLE of the parameters of the Laplace distribution (1/2σ)exp(−|x−μ|/σ) — which, although not as utterly ubiquitous as the Gaussian, is still reasonably common — tells you to set μ to the *median* of the sample, and σ to the "naïve standard deviation": the mean of |x_i − μ| over every x_i in your sample. (I call it naïve because, when I first learned about standard deviations, I wondered why you were supposed to square, average, and then square-root the thing — this is what you'd get if you didn't square and square-root.) The derivation of why it's the median is somewhat cute: the derivative of |x| is just the sign of x, so at the maximum the signs of all the x_i − μ have to balance, i.e. equally many sample points sit above μ as below it, which is exactly the median.
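Written out in full (a sketch; L is the log-likelihood of a sample x_1, …, x_n, differentiating away from the kinks of |·|):

```latex
L(\mu,\sigma) = \sum_i \log\!\Big(\tfrac{1}{2\sigma} e^{-|x_i-\mu|/\sigma}\Big)
             = -n\log(2\sigma) - \frac{1}{\sigma}\sum_i |x_i - \mu|

\frac{\partial L}{\partial \mu} = \frac{1}{\sigma}\sum_i \operatorname{sign}(x_i - \mu) = 0
\;\Longrightarrow\; \#\{x_i > \mu\} = \#\{x_i < \mu\}
\;\Longrightarrow\; \mu = \operatorname{median}(x_1,\dots,x_n)

\frac{\partial L}{\partial \sigma} = -\frac{n}{\sigma} + \frac{1}{\sigma^2}\sum_i |x_i - \mu| = 0
\;\Longrightarrow\; \sigma = \frac{1}{n}\sum_i |x_i - \mu|
```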

I wonder which other statistics you can force to be the max-likelihood parameter estimates of some distribution? Can you mix and match them? Can you force more than two at once?
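Meanwhile, the Laplace claim itself is easy to sanity-check numerically. A minimal sketch (numpy assumed; brute-force grid search over μ, which works because the μ minimizing Σ|x_i − μ| is optimal regardless of σ):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(101)  # any sample will do; odd n gives a unique median

def laplace_loglik(mu, sigma):
    # log-likelihood of the sample under (1/2σ)exp(-|x-μ|/σ)
    return -len(x) * np.log(2 * sigma) - np.abs(x - mu).sum() / sigma

# Fix σ = 1 and brute-force μ over a fine grid.
grid = np.linspace(x.min(), x.max(), 20001)
mu_hat = grid[np.argmax([laplace_loglik(m, 1.0) for m in grid])]
print(abs(mu_hat - np.median(x)) < 1e-3)   # the grid maximizer sits at the median

# With μ at the median, dL/dσ = 0 gives σ = mean of |x_i - μ|:
mu = np.median(x)
sigma_hat = np.abs(x - mu).mean()
# nudging σ either way can only lower the likelihood
print(laplace_loglik(mu, sigma_hat) >= laplace_loglik(mu, 1.1 * sigma_hat))
print(laplace_loglik(mu, sigma_hat) >= laplace_loglik(mu, 0.9 * sigma_hat))
```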