Here's a stupid simple open-ended question about statistics.

Suppose I've got a bunch of boolean random variables X_n. They could be totally independent: P(X_i ^ X_j) = P(X_i)P(X_j) when i != j, so each X_i just fires with some probability and doesn't care about the other guys. Or they could be maximally correlated, so that P(X_1 ^ X_2 ^ X_3 ...) = p: there's a fixed probability p, and when that chance comes up, everybody fires together.

Let me assume that in any case there's symmetry across the different X_i: suppose I'm just handed p_1, p_2, p_3, ... such that

P(X_i) = p_1
P(X_i ^ X_j) = p_2 (when i!=j)
P(X_i ^ X_j ^ X_k) = p_3 (when i,j,k are all != pairwise)

and so on.

(Given any such specification of ps subject to some inequalities, I can reconstruct the full joint probability distribution.)
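That reconstruction can be made concrete: by inclusion-exclusion, the probability of one specific configuration with exactly k variables on (and the other n-k off) is sum_{j=0}^{n-k} (-1)^j C(n-k, j) p_{k+j}, with p_0 = 1. A brute-force sketch in Python (the function name is mine), sanity-checked against the independent case:

```python
from math import comb

def config_prob(p, k, n):
    """Probability of one specific configuration with exactly k variables on,
    given the symmetric joint-on probabilities p[0] = 1, p[1] = p_1, ..., p[n] = p_n.
    Inclusion-exclusion over which of the remaining n - k variables are also on."""
    return sum((-1) ** j * comb(n - k, j) * p[k + j] for j in range(n - k + 1))

# Sanity check on the independent case: p_k = q^k should recover the
# product-form mass q^k * (1 - q)^(n - k) for each configuration.
n, q = 4, 0.3
p = [q ** k for k in range(n + 1)]
for k in range(n + 1):
    assert abs(config_prob(p, k, n) - q ** k * (1 - q) ** (n - k)) < 1e-12
```

(The "some inequalities" are exactly the requirement that every such alternating sum come out nonnegative.)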

So the independent situation is

(p_1, p_2, p_3,...) = (p,p^2,p^3,...)

and the completely dependent situation is

(p_1, p_2, p_3,...) = (p,p,p,...)

Question: Is there any nice canonical way to interpolate between these two guys? A single knob to turn, rather than the n different knobs p_1, p_2, p_3, ...?

---

Hm, a possible answer that might be what I want: a Boltzmann distribution whose energy function sums a term over each edge of the complete graph, with one value if the boolean assignments at the two endpoints of that edge differ, and another value if they agree.

This puts (p_1, p_2, p_3,...) = (p,p^2,p^3,...) in the middle of the spectrum, I think, with (p_1, p_2, p_3,...) = (p,p,p,...) at one end, and the "strongly dependent but anti-correlated" picture (p_1, p_2, p_3,...) = (p,0,0,...) at the other end.
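This Ising-style knob is easy to brute-force for small n. A sketch, where the coupling J is the single knob (J = 0 gives independence, J > 0 pulls toward everybody-fires-together, J < 0 toward anti-correlation) and the per-vertex field h is my addition, used only to tune the marginal p_1 (the energy above has edge terms only, which by global-flip symmetry would pin p_1 at 1/2):

```python
from itertools import product
from math import exp

def symmetric_ps(n, J, h):
    """Brute-force the Boltzmann distribution on n booleans whose energy sums,
    over every edge of the complete graph, -J if the endpoints agree and +J if
    they disagree, plus a field h per 'on' vertex (my addition, to tune p_1).
    Returns (p_1, p_2)."""
    weights = {}
    for cfg in product((0, 1), repeat=n):
        agree = sum(1 for i in range(n) for j in range(i + 1, n)
                    if cfg[i] == cfg[j])
        disagree = n * (n - 1) // 2 - agree
        energy = -J * agree + J * disagree + h * sum(cfg)
        weights[cfg] = exp(-energy)
    Z = sum(weights.values())
    p1 = sum(w for cfg, w in weights.items() if cfg[0] == 1) / Z
    p2 = sum(w for cfg, w in weights.items() if cfg[0] == 1 and cfg[1] == 1) / Z
    return p1, p2

# J = 0 recovers independence (p_2 = p_1^2 exactly); J > 0 pushes p_2 toward
# p_1; J < 0 pushes p_2 below p_1^2, toward the anti-correlated end.
```

One wrinkle: h has to be re-tuned as J changes if you want to hold p_1 = p fixed while turning the correlation knob, so strictly it's a one-parameter family of curves rather than a single dial.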