- What are characteristic functions? They're like a Fourier transform of the probability density function. If the pdf of a random variable X is f, then its char. func. is φ_X(t) = ∫ e^{itx} f(x) dx
- They have the exponentialish property (uncoincidentally reminiscent of how the Fourier transform turns convolution into pointwise multiplication) that if X and Y are independent random variables, then φ_{X+Y}(t) = φ_X(t) φ_Y(t)
- The first few terms of the Taylor series of the characteristic function of a random variable can be worked out just from its moments about zero: consider the defining formula, differentiate n times wrt t, and set t to 0. Up to a factor of n! that's the nth Taylor coefficient, and staring you in the face is a moment integral: i^n ∫ x^n f(x) dx
- In particular, if your mean is zero and your standard deviation is one, your char. func.'s Taylor series starts off 1 - t^2/2 + ...
- If you have n of these dudes all independent, then the char. func. of their sum is the n-way product of their characteristic functions, which are all identical, so it's just an n-th power.
- After some normalization magic (dividing the sum by √n, which replaces t with t/√n), this winds up resembling the formula (1 - t^2/(2n))^n
- So, dig deep and remember the limiting formula for the exponential, (1 + x/n)^n → e^x, and realize the char. func. of the rescaled sum tends to e^{-t^2/2}. That happens to be exactly the char. func. of the standard normal, QED. (The whole computation is spelled out below, with a quick numerical check.)
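
To spell those last few bullets out in one place, here's the computation as a sketch, writing φ for the common characteristic function of the standardized summands (mean 0, variance 1) and S_n for the sum of n independent copies:

```latex
% Taylor expansion at 0: the linear term vanishes because E[X] = 0,
% and the quadratic coefficient is -1/2 because E[X^2] = 1.
\varphi(t) = 1 + it\,\mathbb{E}[X] - \frac{t^2}{2}\,\mathbb{E}[X^2] + o(t^2)
           = 1 - \frac{t^2}{2} + o(t^2)

% Characteristic function of the rescaled sum S_n / \sqrt{n}:
% independence turns the sum into an n-th power, and the rescaling
% replaces t with t / \sqrt{n}.
\varphi_{S_n/\sqrt{n}}(t)
  = \varphi\!\left(\frac{t}{\sqrt{n}}\right)^{\!n}
  = \left(1 - \frac{t^2}{2n} + o\!\left(\frac{1}{n}\right)\right)^{\!n}
  \longrightarrow e^{-t^2/2} \qquad (n \to \infty)
```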
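And a quick numerical sanity check (my own sketch, not from the argument above): take X = ±1 with probability 1/2 each, so mean 0, variance 1, and φ(t) = cos(t); then cos(t/√n)^n should hug e^{-t^2/2} as n grows.

```python
# Sanity check of the limit above for X = +/-1 with probability 1/2 each:
# that X has mean 0, variance 1, and characteristic function phi(t) = cos(t),
# so the rescaled sum of n copies has characteristic function cos(t/sqrt(n))^n.
import numpy as np

t = np.linspace(-3, 3, 121)         # grid of t values to compare on
target = np.exp(-t**2 / 2)          # char. func. of the standard normal

for n in (10, 100, 1_000, 10_000):
    approx = np.cos(t / np.sqrt(n)) ** n
    print(f"n = {n:>6}: max |cos(t/sqrt(n))^n - exp(-t^2/2)| = "
          f"{np.max(np.abs(approx - target)):.2e}")
```

For this particular X the maximum gap shrinks roughly like 1/n, consistent with the o(1/n) term in the expansion.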