Two things, one fairly simple and accessible, the other less so:
Over dinner a while ago, tom7 mentioned taking high school physics and being taught how to compute the resistance of resistors in series and in parallel, but not how to do it for general resistor networks — and indeed his teacher seemed not even to know how to do it, and mutely referred him to a college physics textbook.
So I hope you're already asking "well, how do I do it?" — unless you're in ECE or the appropriate part of theoretical CS and already know. And if you are one of those people and can tell that I've got something wrong below, do tell me.
The standard sort of reductionist way is to invoke Kirchhoff's current law (the one that says the current flowing into a node equals the current flowing out) and Ohm's law (which tells you how current, voltage, and resistance are related), write down a big ol' matrix of equations for all your unknowns, and just solve it.
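As a sketch of that reduction (my example, not from any textbook): here's the whole thing in a few lines of numpy, on a made-up Wheatstone bridge of five 1Ω resistors. The balanced bridge is a nice sanity check, since its end-to-end resistance is known to be 1Ω (no current crosses the bridge resistor, leaving two 2Ω paths in parallel).

```python
import numpy as np

# A hypothetical 4-node Wheatstone bridge: every resistor is 1 ohm.
# Node 0 is held at 1 V, node 3 at 0 V; nodes 1 and 2 are free.
edges = [(0, 1, 1.0), (0, 2, 1.0), (1, 3, 1.0), (2, 3, 1.0), (1, 2, 1.0)]

n = 4
# Weighted graph Laplacian: L[i][i] = total conductance at i,
# L[i][j] = minus the conductance between i and j.
L = np.zeros((n, n))
for u, v, r in edges:
    g = 1.0 / r
    L[u, u] += g; L[v, v] += g
    L[u, v] -= g; L[v, u] -= g

# Kirchhoff's current law at each free node i says (L @ V)[i] = 0.
# Move the known voltages (V0 = 1, V3 = 0) to the right-hand side.
free = [1, 2]
V = np.zeros(n); V[0] = 1.0
b = -L[np.ix_(free, [0, 3])] @ V[[0, 3]]
V[free] = np.linalg.solve(L[np.ix_(free, free)], b)

# Net current injected at the source gives the effective resistance.
I = L[0] @ V
print(V)        # [1.  0.5 0.5 0. ] -- the bridge is balanced
print(1.0 / I)  # 1.0, the effective end-to-end resistance
```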
The cooler thing about it (which is maybe not a fundamentally different method, but a cooler way of thinking about it) is that it all reduces to the Laplacian and harmonic-function stuff rweba was talking about recently.
Imagine on the one hand a stupid game where you're doing a random walk on a graph, but some vertices end the game if you reach them and give you a certain score. I could ask, well, what's the expected value of my score if I'm plopped down at a certain place in the graph? Think of this expected-score answer as a function from vertices to real numbers. It's harmonic in the sense that every (non-exit) vertex's value in this function is equal to the average of its neighbors. This follows from sort of inductively thinking about playing the game: "well, what's my score likely to be, starting from here? It's either going to be whatever my expected score would be if I had started at that neighboring vertex, or that one, or that one, and I choose which one I go to uniformly."
And on the other hand, voltages in a resistor network (where some nodes are forced to be at a certain voltage) are also a harmonic function — for the special case that all resistors have the same resistance R, anyway, which easily generalizes. For consider one (non-tied) node x and its neighbors y1, ..., yn. Ohm tells us that the current Ii flowing from yi to x is equal to (Vi−V)/R, where Vi is the voltage at yi and V is the voltage at x. (It'll be negative if current is actually flowing from x to yi.) Kirchhoff tells us that the sum of these currents is zero. Doing some algebra, we get that V is the average of the Vi.
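Spelled out, the "some algebra" is just this (assuming, as above, that every resistor is the same R):

```latex
% Kirchhoff: the currents into x sum to zero.
\sum_{i=1}^{n} I_i \;=\; \sum_{i=1}^{n} \frac{V_i - V}{R} \;=\; 0
% Multiply through by R and solve for V:
\sum_{i=1}^{n} V_i - nV = 0
\qquad\Longrightarrow\qquad
V = \frac{1}{n}\sum_{i=1}^{n} V_i
```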
In one sentence: the expected score starting from a given node in a random-walk game with scored exit nodes is the same as the voltage at that node in a resistor network with voltage sources, and the reason is that both random walks and physics force these quantities to be harmonic functions.
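A quick simulation makes the correspondence concrete (again my own made-up example): take a little 4-node diamond-with-a-crossbar graph, viewed as a unit-resistor network with node 0 held at 1 V and node 3 grounded. By symmetry the interior voltages are 0.5, so a random walk scoring 1 for exiting at node 0 and 0 for exiting at node 3 should have expected score about 0.5 from an interior start.

```python
import random

random.seed(0)

# Hypothetical graph: a diamond 0-1-3, 0-2-3 with a crossbar 1-2,
# read as a unit-resistor network with exits at nodes 0 and 3.
neighbors = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 3], 3: [1, 2]}
score = {0: 1.0, 3: 0.0}  # exit nodes and their payoffs

def play(start):
    """Walk uniformly at random until hitting an exit; return its score."""
    v = start
    while v not in score:
        v = random.choice(neighbors[v])
    return score[v]

trials = 100_000
est = sum(play(1) for _ in range(trials)) / trials
print(est)  # ≈ 0.5, matching the voltage at node 1
```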
Walking home after the bus ride from Sherbrook last night, I realized an immense simplification of this category theory nonsense, which made me feel awfully dumb for worrying so much about how to define something that turned out to be free strong ω-categories — not least because I don't even need freeness!
We know what a strong n-category is; it's just a category enriched over (n-1)Cat, and strong ω-categories are just the "limit" of this definition. In such things, compositions are associative on the nose, and we get associativity of pasting diagrams very naturally out of enrichment from functoriality of composition. So I'm tempted to make the following definitions:
Two n-cells x, y in a strong ω-category with the same source and target are equivalent if there are (n+1)-cells f : x → y, g : y → x such that fg and gf are both equivalent to the appropriate identity. Note this is a kind of coinductive definition. If I were to be careful I would talk about an infinite binary tree of equivalence pairs satisfying a certain condition. Or maybe you could do it inductively on finite trees whose leaves are identities.
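In symbols (my rendering, not a careful formalization), the coinductive clause reads:

```latex
% Read coinductively, i.e. as a greatest fixed point: the recursive
% occurrences of \simeq on the right are at dimension n+1.
x \simeq y \;:\Longleftrightarrow\;
\exists\, f : x \to y,\ g : y \to x.\;\;
g \circ f \simeq \mathrm{id}_x \;\wedge\; f \circ g \simeq \mathrm{id}_y
```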
A weak ω-category is a strong ω-category C together with a subset X of the cells of C such that for every cell c of C there is an x ∈ X equivalent to c. A weak ω-functor from (C,X) to (C',X') is a strong ω-functor from C to C' that preserves the distinguished subset.
I haven't thought through how to define higher morphisms on such things, but weak associativity properties already seem to just pop out of these definitions. This, together with the almost, er, asinine simplicity of the definitions, is enough to make me actually believe that this definition is either genuinely interesting or totally, totally wrong.
One tricky thing is that to get the Baez-Dolan "periodic table" you have to demand not that there is exactly one object (resp. arrow, 2-cell, etc.) but rather that all objects (resp. arrows, 2-cells, etc.) are equivalent. This seems strangely right to me, actually.