September 22nd, 2012

beartato phd

(no subject)

Back to nyc today. About to get on a train in like half an hour. Really enjoyed seeing everyone that I did manage to see! Sadly missed a few people I meant to, but there is only so much time.

Still thinking more about this logic where all the propositions are labelled over a boolean algebra, struggling to complete a heuristic analogy:
trivial algebra : plain intuitionistic logic
commutative monoid : linear logic
monoid : ordered logic
boolean algebra : ???
(although I'm not really putting all the pertinent information in this little table: e.g., it's not just the choice of the commutative monoid that gives linear logic, but also the choice of how you encode connectives into monoid operations. Still...)

Anyway it's so maddeningly close to looking like bunched logic, even though I know it's not quite, because it satisfies things like A & top |- A * A.

(no subject)

On train now.

To recap, I'm thinking about a logic where every judgment (both on the left in Γ and on the right) looks like "A [t]": a proposition A together with a label drawn from the language of boolean expressions over some variables x:
s, t, u ::= x | t + t | ~t | 1 | 0

The "+" here indicates OR, and the ~ indicates NOT.

There are some boring rules that mostly do what you expect and propagate the label of a proposition to all child-propositions it creates, like
Γ, A [t] ⊢ B [t]
Γ ⊢ A ⇒ B [t]

So we can build propositions out of all the usual intuitionistic connectives, including first-order quantifiers over the domain of boolean terms.

The rule for atoms requires equality on labels:
Γ ⊢ t = s
Γ, a [t] ⊢ a [s]

We can decide equality on labels without resorting to an axiomatized equational theory on the language of expressions above; instead we just consider all possible 0/1 assignments of the free boolean variables that satisfy the equalities in the context, and check whether the equality in the conclusion holds under every such assignment, evaluating the boolean expressions in the usual way.
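A minimal sketch of that decision procedure (the term encoding and all the names here are hypothetical choices of mine, just to pin the idea down):

```python
from itertools import product

# Terms of  s, t, u ::= x | t + t | ~t | 1 | 0,  encoded as tuples:
# ('var', 'x'), ('or', a, b), ('not', a), ('one',), ('zero',).

def free_vars(t):
    """Collect the free boolean variables of a term."""
    if t[0] == 'var':
        return {t[1]}
    if t[0] == 'or':
        return free_vars(t[1]) | free_vars(t[2])
    if t[0] == 'not':
        return free_vars(t[1])
    return set()  # constants 1 and 0

def ev(t, rho):
    """Evaluate a term under a 0/1 assignment rho."""
    if t[0] == 'var':
        return rho[t[1]]
    if t[0] == 'or':
        return ev(t[1], rho) | ev(t[2], rho)
    if t[0] == 'not':
        return 1 - ev(t[1], rho)
    return 1 if t[0] == 'one' else 0

def entails(eqs, goal):
    """Does every 0/1 assignment satisfying all equalities in eqs
    (pairs of terms) also satisfy the goal equality?"""
    xs = sorted(set().union(*(free_vars(a) | free_vars(b)
                              for a, b in eqs + [goal])))
    for bits in product((0, 1), repeat=len(xs)):
        rho = dict(zip(xs, bits))
        if all(ev(a, rho) == ev(b, rho) for a, b in eqs):
            a, b = goal
            if ev(a, rho) != ev(b, rho):
                return False
    return True
```

So, e.g., `entails` confirms x + ~x = 1 from an empty context, and x = 1 only in contexts that force it.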

The identity and cut principles look like so:
Γ ⊢ t = s
Γ, A [t] ⊢ A [s]

Γ ⊢ A [t]   Γ, A [t] ⊢ C [s]
Γ ⊢ C [s]

The fun rules are these, for modalities ⊞ and ⊟ (which happen to be positive and negative from the point of view of focusing) which adjust the label on propositions:
Γ ⊢ A [t+s]
Γ ⊢ ⊞s A [t]

Γ, A [t+s] ⊢ J
Γ, ⊞s A [t] ⊢ J

Γ, t = x+s ⊢ A [x]   (x fresh)
Γ ⊢ ⊟s A [t]

Γ ⊢ t = u+s   Γ, A [u] ⊢ J
Γ, ⊟s A [t] ⊢ J

Conveniently, in addition to the focusing mnemonic just mentioned, the interpretation of ⊞s is that it "adds" the expression s to the current label, and ⊟s "subtracts" it. It's sort of like saying that
⊞s A [t] = A [t + s]
⊟s A [t] = ∀x. t = x + s ⇒ A [x]
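One sanity check that the rules hang together: ⊞s ⊟s A ⊢ A comes out derivable (sketched in LaTeX, writing \boxplus/\boxminus for ⊞/⊟ and reading bottom-up). The converse A ⊢ ⊞s ⊟s A seems to fail, since t+s = x+s doesn't force x = t.

```latex
% Bottom-up derivation of  ⊞_s ⊟_s A [t] ⊢ A [t]
\begin{align*}
  &\boxplus_s \boxminus_s A\,[t] \vdash A\,[t]
     && \text{goal}\\
  &\boxminus_s A\,[t+s] \vdash A\,[t]
     && \boxplus\text{-left}\\
  &A\,[t] \vdash A\,[t]
     && \boxminus\text{-left with } u := t,\ \text{since } t+s = t+s
\end{align*}
```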

The reason I think of ⊞ as positive and not ambipolar is because I think I might want to impose the requirement that every label in the context is greater-than-or-equal-to the label on the right, i.e. there must exist an expression u such that s + u = t for Γ, A [t] ⊢ C [s] to be a valid sequent. This would mean that the ⊞-right rule, since it increases the label on the right, would have to have a context-clearing effect, making it synchronous.
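That existence condition is just the pointwise order on boolean functions: an expression u with s + u = t exists exactly when s + t = t, i.e. when s ≤ t under every assignment. A brute-force check of that equivalence (representing labels semantically by their truth tables, a hypothetical encoding of mine, so every table is realized by some expression):

```python
from itertools import product

# A label over n variables, viewed semantically, is a truth table:
# a tuple of 2**n bits.  "+" is pointwise OR.

def join(s, t):
    """Pointwise OR of two truth tables."""
    return tuple(a | b for a, b in zip(s, t))

def check(n):
    """(∃u. s + u = t)  iff  s + t = t, for all tables s, t over n vars."""
    tables = list(product((0, 1), repeat=2 ** n))
    return all(any(join(s, u) == t for u in tables) == (join(s, t) == t)
               for s in tables for t in tables)
```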

Anyway I already conjecture that if I define
A * B = ∃x.(⊞xA ∧ ⊞~xB)
A -* B = ∀x. ⊟x(⊞~xA ⇒ B)

then I will get some interesting connectives vaguely like BI's fuse and magic wand. I guess this is a very mild and fuzzy conjecture, since who knows what "interesting" means. But also just now I realized that there seems to be no reason I can't define
A & B = ∀x.(⊟xA ∧ ⊟~xB)

(because intuitionistic conjunction is ambipolar!) and apparently get a negative conjunction that is simultaneously distinct from both * and the intuitionistic conjunction that the logic inherits from plain intuitionistic logic! This is really weird! A logic with three conjunctions at once? A bunched-ish logic with a conjunction that doesn't distribute over disjunction? Crazy!

It has some things in common with & from linear logic. I can in fact get A & B ⊢ A and A & B ⊢ B, for the reason that I can just choose x to be 0 or 1 on the left, and observe that ⊟0 is the trivial modality. (so too is ⊞0, but I can't get A ⊢ A * B, because I have a conjunction on the right, and I can't cough up (as opposed to discard) a B)
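Spelled out, the derivation of A & B ⊢ A goes like this (bottom-up, choosing x := 0 and then leaning on ⊟0 being trivial; sketched in LaTeX with \boxminus for ⊟):

```latex
% Bottom-up derivation of  (A \mathbin{\&} B)\,[t] \vdash A\,[t]
\begin{align*}
  &\forall x.\,(\boxminus_x A \wedge \boxminus_{\sim x} B)\,[t] \vdash A\,[t]
     && \text{unfold the definition of } \&\\
  &(\boxminus_0 A \wedge \boxminus_1 B)\,[t] \vdash A\,[t]
     && \forall\text{-left, instantiating } x := 0\\
  &\boxminus_0 A\,[t] \vdash A\,[t]
     && \wedge\text{-left, first projection}\\
  &A\,[t] \vdash A\,[t]
     && \boxminus\text{-left with } u := t,\ \text{since } t = t + 0
\end{align*}
```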

However the right rule for & seems to diverge from the one we expect it to have in linear logic, which kind of makes sense, since ∧ already took it! Just because I know Γ ⊢ A and Γ ⊢ B separately doesn't mean I can necessarily prove them "at once" in the way that this & requires me to. Wellll I'm not sure what I mean by "at once", really, but I can observe that A ∧ B ⊢ A & B fails, and indeed it fails in the same way that A * B ⊢ A ∧ B does. And the technical trick I barfed up to ensure exactly that failure was what started me down this path!

So in fact I conclude that since my right rule has been weakened, my left rule must be somewhat stronger, and indeed it is. I'm not only able to choose 0 or 1 when decomposing the ∀; I could choose some other boolean expression based on the free variables in play. This & lets me make a synchronous choice, furthermore a choice that can be sensitive to which "possible world" it's in.

Anyway, weird stuff, dunno what it means. I especially don't know what semantic analogue this & might have back in graphs-world, if any. Maybe it requires a generalization to something other than graphs.


Also what the hell is ∃x.(⊞xA ∨ ⊞~xB)? I was tempted to say ⅋, but it is positive, while ⅋ is negative.