I swear I must have read the Wikipedia article on Closed and Exact differential forms before, but only this time did it sink in that cohomology can actually be about pairs of things being cohomologous. Up until now it had just remained a totally mysterious dual to homology, which is itself still pretty mysterious to me.
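For my own future reference, the thing that clicked, written down as a formula (this is just the standard de Rham definition, nothing deeper):

```latex
% Closed: d\alpha = 0.  Exact: \alpha = d\gamma for some \gamma.
% Two closed k-forms are cohomologous when they differ by an exact form:
\[
  \alpha \sim \beta
  \iff
  \alpha - \beta = d\gamma \quad \text{for some } (k-1)\text{-form } \gamma,
\]
% i.e. they represent the same class $[\alpha] = [\beta]$ in $H^k_{\mathrm{dR}}(M)$.
```

So "cohomologous" really is a relation between pairs of closed forms, not just an abstract dual-to-homology incantation.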
I helped Spoons solve a Twelf problem, and got him talking about his research a bit. It's pretty cool stuff. I always like work that digs down into what you thought were basic assumptions, and finds an extra stratum of generalization you could have used, but didn't. In his case, studying space and time use of functional programs suggests that there is a "stack frame use" monad that even Haskell (famous for putting big ol' locks on all your medicine cabinets and big ol' monads on all your most favorite tantalizing imperative features) lets you use without explicit staging. This is merely because it lets you write functions that aren't tail-recursive! Spoons's system forces a kind of A-normality discipline on you, but then lets you escape it by using the monad, documenting how many stack frames you potentially chew up in the type. I don't quite understand how it works apart from the extremes of "might use arbitrarily much stack" and "only does a tail call", but I'm eager to hear more about it.
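I don't actually know the details of Spoons's system, but here's my guess at the flavor of that "stack frame use" monad: a graded monad indexed by a type-level upper bound on frames, where sequencing sums the bounds and a non-tail call bumps the count by one. All the names here are my own invention, purely a sketch:

```haskell
{-# LANGUAGE DataKinds #-}
{-# LANGUAGE TypeOperators #-}
{-# LANGUAGE KindSignatures #-}
import GHC.TypeLits

-- A computation tagged with an upper bound n on the stack frames it may use.
-- The Nat index is purely phantom: it does bookkeeping in the type only.
newtype Stacked (n :: Nat) a = Stacked { runStacked :: a }

-- Returning a value (or making only a tail call) costs no frames.
ret :: a -> Stacked 0 a
ret = Stacked

-- Sequencing sums the bounds: this is a graded-monad bind, not a plain
-- Monad instance, because the index changes.
bind :: Stacked m a -> (a -> Stacked n b) -> Stacked (m + n) b
bind (Stacked x) f = Stacked (runStacked (f x))

-- A non-tail call consumes one extra frame.
call :: Stacked n a -> Stacked (n + 1) a
call (Stacked x) = Stacked x

-- e.g. this computation is bounded by one frame:
example :: Stacked 1 Int
example = bind (call (ret 5)) (\x -> ret (x + 1))
```

The two extremes I mentioned would then be a computation of type `Stacked 0 a` ("only does a tail call") versus one whose index is unconstrained ("might use arbitrarily much stack").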
I found out there is a whole boatload of papers by Nadathur and pals, arising from research on λProlog and Teyjus, on implementation techniques for representation of lambda terms. Titles like "Tradeoffs in the Intensional Representation of Lambda Terms" and "A Treatment of Higher-Order Features in Logic Programming" make me wonder why I haven't read them already.