## March 29th, 2007

### (no subject)

Someone pointed me to a paper on something they call "weighted dynamic programming". I haven't finished reading the paper, so I dunno if it's really "dynamic programming" in the sense I'm used to.

The upshot is that it's a little domain-specific language that lets you write, in a nice declarative style that looks a lot like logic programming, some apparently common algorithmic idioms in natural language processing. From their website, here is an example from the tutorial showing how to compute the shortest path in a directed graph:

    pathto(V) min= edge("Start", V).
    pathto(V) min= pathto(U) + edge(U,V).

The first clause means: "the length of the shortest path to V is no longer than the length of the edge from the start vertex to V". The second clause means: "the length of the shortest path to V is no longer than the length of the shortest path to U plus the length of any edge from U to V". The compiler then turns this into the sensible version of Dijkstra's algorithm you'd expect.
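As a rough sketch of what such a compiled evaluation strategy might look like (the graph encoding and function names here are my own illustration, not anything from the Dyna paper), the two clauses amount to Dijkstra-style relaxation with a priority queue:

```python
import heapq

def shortest_paths(edges, start="Start"):
    """Compute pathto(V) for every reachable V.

    `edges` maps (u, v) -> nonnegative edge length; this encoding is
    an assumption for the sketch, not Dyna's own representation."""
    # Build an adjacency list from the edge relation.
    adj = {}
    for (u, v), w in edges.items():
        adj.setdefault(u, []).append((v, w))

    dist = {}                    # settled values of pathto(V)
    frontier = [(0.0, start)]    # candidate values, smallest first
    while frontier:
        d, u = heapq.heappop(frontier)
        if u in dist:
            continue             # already settled with a smaller value
        dist[u] = d
        # Second clause: pathto(V) min= pathto(U) + edge(U, V).
        for v, w in adj.get(u, []):
            if v not in dist:
                heapq.heappush(frontier, (d + w, v))
    dist.pop(start, None)        # pathto is only defined via the clauses
    return dist
```

The `min=` aggregation shows up as the priority-queue discipline: a vertex is settled the first time it is popped, because no later candidate can beat that value.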

I think what's really going on here is programming in a monad I haven't seen before: the free-rig monad. The nondeterminism in ordinary logic programming is more or less the free-monoid (i.e. list) monad, since we typically do notice the ordering of clauses. There we have one kind of "nondeterministic split" in our hands, namely the ability to write multiple clauses. In Dyna, however, we have another one: in the case of the rig (R≥0 ∪ {∞}, min, +, ∞, 0), it's the ability to spawn off two computations and then add their answers, as in pathto(U) + edge(U,V).
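To make the two splits concrete, here's a minimal sketch (my own names and representation, nothing from the paper) of terms of the free rig over numeric leaves, one constructor per split, interpreted in (R≥0 ∪ {∞}, min, +, ∞, 0):

```python
from dataclasses import dataclass
from typing import Union

@dataclass
class Leaf:
    """A settled numeric value, e.g. an edge length."""
    value: float

@dataclass
class Min:
    """The clause-level split familiar from logic programming."""
    left: "Term"
    right: "Term"

@dataclass
class Plus:
    """The second split: run two computations and add their answers,
    as in pathto(U) + edge(U, V)."""
    left: "Term"
    right: "Term"

Term = Union[Leaf, Min, Plus]

def eval_rig(t: Term) -> float:
    """Interpret a free-rig term in (R>=0 ∪ {∞}, min, +, ∞, 0)."""
    if isinstance(t, Leaf):
        return t.value
    if isinstance(t, Min):
        return min(eval_rig(t.left), eval_rig(t.right))
    return eval_rig(t.left) + eval_rig(t.right)
```

For instance, a vertex reachable directly at cost 4 or via an intermediate hop of 1 + 1 gets the term `Min(Leaf(4), Plus(Leaf(1), Leaf(1)))`, which evaluates to 2.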

The point is that if we have a free-forgetful adjunction F ⊣ U (which induces the monad UF) and a type UB that is the underlying type of some rig B (the way R≥0 ∪ {∞} is the underlying type of the rig (R≥0 ∪ {∞}, min, +, ∞, 0)), then we can use U applied to the counit at B, i.e. UεB : UFUB → UB, to get from the type of monadic computations returning UB back to UB. I think the benefits the Dyna compiler claims could be construed as deforesting away the rig-expression tree that would otherwise be generated.
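That deforestation point can be illustrated on a small acyclic graph: building the rig-expression tree and then evaluating it (the counit) gives the same answer as a fused version that never materialises the tree. This is my own sketch of the idea, not the Dyna compiler's actual transformation:

```python
def pathto_expr(edges, v, start="Start"):
    """Unfused: build pathto(v) as a nested rig expression.
    A term is a number, ("plus", term, w), or ("min", [terms])."""
    alts = []
    for (u, t), w in edges.items():
        if t != v:
            continue
        # First clause gives a bare edge length; second clause adds
        # an edge onto a recursively built subexpression.
        alts.append(w if u == start else ("plus", pathto_expr(edges, u, start), w))
    return ("min", alts)

def eval_expr(e):
    """The counit: collapse a rig-expression tree to a number."""
    if isinstance(e, tuple):
        if e[0] == "min":
            return min(eval_expr(a) for a in e[1])
        return eval_expr(e[1]) + e[2]
    return e

def pathto_fused(edges, v, start="Start"):
    """Deforested: same recursion, but min and + are applied on the
    fly, so no intermediate tree is ever built."""
    best = float("inf")
    for (u, t), w in edges.items():
        if t != v:
            continue
        d = w if u == start else pathto_fused(edges, u, start) + w
        best = min(best, d)
    return best
```

The two versions agree by construction; the fused one is just the unfused one with `eval_expr` pushed through the recursion, which is the shape of benefit a compiler could claim.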