omg generating functions are just the most beautiful thing ever.

Imagine you have a model of a graph structure growing over time. Every time step a vertex is added, and with probability δ a random edge is also added, with its endpoints chosen uniformly at random among the existing vertices. Imagine also that you're crazy and, even after you've satisfied the δ probability that you're going to try to add an edge, there's a further probability σ that you add a *broken* edge that just has two half-edges dangling from the two vertices it would have connected. (We actually have a motivation for this, though it sounds crazy.)
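If you want to watch this process happen, here's a minimal simulation sketch, assuming one reading of the ambiguous bits: endpoints are drawn from all vertices present at that step (including the new one), and may coincide. The function name `simulate` and the union-find bookkeeping are my own scaffolding, not anything from the original model.

```python
import random
from collections import Counter

def simulate(steps, delta, sigma, seed=0):
    """Each step adds a vertex; with probability delta pick two vertices
    uniformly at random, then with further probability sigma attach a
    *broken* edge (a dangling half-edge on each endpoint); otherwise the
    edge is real and merges the endpoints' components."""
    rng = random.Random(seed)
    parent = []   # union-find parent pointers
    halves = []   # dangling half-edge count, maintained at component roots

    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]  # path halving
            v = parent[v]
        return v

    for _ in range(steps):
        v = len(parent)
        parent.append(v)
        halves.append(0)
        if rng.random() < delta:
            a, b = rng.randrange(v + 1), rng.randrange(v + 1)
            if rng.random() < sigma:
                # broken edge: one half-edge dangles from each endpoint
                halves[find(a)] += 1
                halves[find(b)] += 1
            else:
                ra, rb = find(a), find(b)
                if ra != rb:  # real edge: merge the two components
                    parent[ra] = rb
                    halves[rb] += halves[ra]

    # tally components by (n, m) = (#vertices, #half-edges)
    sizes = Counter(find(v) for v in range(len(parent)))
    return Counter((sizes[r], halves[r]) for r in sizes)
```

Dividing the tallies by the number of steps gives a rough empirical stand-in for the asymptotic (n,m)-component densities that the recurrences below describe.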

What you *can* do is write down recurrences that describe just how many connected components you should expect to find, asymptotically, that have n vertices and m half-edges in them. These recurrences take up about half a page. Nested sums, the works.

Alternatively, you multiply all these guys by nx^n y^m, sum them up, and like magic, all of the nested sums turn out to be what you *would* have gotten by doing algebraic and analytic manipulation of formal power series. In other words, that whole half-page of information is expressed perfectly well by a single line:

g = x + 2δg_x x(-1 + σy + (1-σ)g)

That -1 there describes the fact that the number of n-sized m-half-edged (call them "(n,m)") components tends to decrease by the process of becoming other kinds of components; this happens when one end of an edge lands in one. The σy describes the process of a new (n,m+1) component being created by a half-edge being added to an (n,m) one. The (1-σ)g describes components joining together, and hides the *four* nested summations ordinarily required to say everything about the sizes and half-edges adding up right. The x on the left is the fact that we're adding a new singleton vertex (a (1,0) component) every time step. That's it! Math is so fucking beautiful sometimes.
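For concreteness, here's what unpacking that single line back into coefficients looks like. This is my own schematic reconstruction (writing g = Σ g_{n,m} x^n y^m and glossing over the time-step normalization), just to show where the hidden summations live:

```latex
% Reading off the coefficient of x^n y^m on both sides of
%   g = x + 2\delta\, g_x\, x\, (-1 + \sigma y + (1-\sigma) g):
g_{n,m} = [n{=}1][m{=}0]
  + 2\delta\Big( -\, n\, g_{n,m}
  + \sigma\, n\, g_{n,m-1}
  + (1-\sigma) \sum_{n_1+n_2=n} \sum_{m_1+m_2=m} n_1\, g_{n_1,m_1}\, g_{n_2,m_2} \Big)
```

The constrained double sum at the end is exactly the convolution that, written out over free indices, becomes the four nested summations mentioned above.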