The idea is to imagine an intuitionistic logic with some kind of modal operator ◯ that means something like "at the next time step" in a temporal modal logic that has only two times, call them 0 and 1. If you try to go to the next time step when you're at time 1, you just stay there --- so ◯, in this case, is really just forcing you to go to time 1.
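This two-world setup is small enough to check exhaustively. Here's a minimal sketch (the encoding and the names `circle`, `nxt`, `meet`, `join` are mine, just for illustration): propositions are the up-closed subsets of the frame 0 ≤ 1, and ◯A holds at a world iff A holds at the next time step, where "next" always lands on time 1.

```python
# Two-world intuitionistic Kripke frame: times 0 <= 1.
# Propositions are the up-closed sets of worlds: {}, {1}, {0,1}.
# nxt(w) = 1 for both worlds, so ◯ "forces you to go to time 1".
from itertools import product

PROPS = [frozenset(), frozenset({1}), frozenset({0, 1})]

def nxt(w):
    return 1  # time 1 steps to itself; time 0 steps to time 1

def circle(a):   # ◯A: holds at w iff A holds at nxt(w)
    return frozenset(w for w in (0, 1) if nxt(w) in a)

def meet(a, b):  # A ∧ B
    return a & b

def join(a, b):  # A ∨ B
    return a | b

# ◯ is idempotent and commutes with ∧ and ∨ on every proposition:
for a, b in product(PROPS, repeat=2):
    assert circle(circle(a)) == circle(a)
    assert circle(meet(a, b)) == meet(circle(a), circle(b))
    assert circle(join(a, b)) == join(circle(a), circle(b))
print("ok")
```

Note that ◯A is always either empty or the whole frame, depending only on whether A holds at time 1.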
I was hoping to model the "funny product" monoidal closed structure on Grph that I had thought about here and here and here. I still have the intuition that "the logic of Grph" should look something like BI (at least the interpretation I have in mind appears to be sound for BI), but I have no traction on a completeness proof at all. I stared for a while at the ol' completeness proof for intuitionistic logic interpreted in sets with respect to a Kripke frame, but I don't see how to generalize it at all.
So I surmised that maybe I could coax out of the above simple temporal logic (almost seems too trivial to even call it "temporal", but whatev) a correspondence
(a) between proofs of the proposition A and edges in the interpretation of A
(b) between proofs of the proposition ◯A and vertices in the interpretation of A
bearing in mind the intuition (gleaned from thinking about Grph as a presheaf category over the category T with two objects "Vertex" and "Edge" and two morphisms src, trg : Vertex → Edge) that edges "evolve" over "time" into vertices. (Why not the reverse, vertices evolving into edges? Because of the op in Grph = Sets^(Tᵒᵖ).)
Aaaand I was sort of ignoring the fact that there should be two distinct ways of edges evolving into vertices, hoping this would affect only proofs and not provability.
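Concretely, a presheaf on T is a graph given by a set of edges, a set of vertices, and the two maps src*, trg* : Edges → Vertices (the actions of src and trg, reversed by the op), and those two maps are exactly the two ways an edge "evolves" into a vertex. A sketch, with an encoding of my own:

```python
# A presheaf on T (objects Vertex, Edge; arrows src, trg: Vertex -> Edge)
# is a graph: the op reverses the arrows, giving maps Edges -> Vertices.
from dataclasses import dataclass, field

@dataclass
class Graph:
    vertices: set
    edges: set
    src: dict  # edge -> its source vertex (action of src, reversed)
    trg: dict  # edge -> its target vertex (action of trg, reversed)

# the walking edge: one edge x from vertex 0 to vertex 1
walking_edge = Graph({0, 1}, {"x"}, {"x": 0}, {"x": 1})

# two distinct ways the edge x evolves into a vertex:
assert walking_edge.src["x"] == 0
assert walking_edge.trg["x"] == 1
```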
Anyway, the definition of A ⊗ B as (A ∧ ◯B) ∨ (◯A ∧ B) is meant to capture the semantic definition: an edge of A ⊗ B arises either from an edge of A and a vertex of B, or from a vertex of A and an edge of B. And if I hit A ⊗ B with ◯ and note that ◯ commutes with everything in sight (and is idempotent), I find that ◯(A ⊗ B) ⊣⊢ ◯A ∧ ◯B, which corresponds to the fact that the vertex set of the tensor product is the cartesian product of the vertex sets of A and B.
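That semantic definition of the funny tensor product can be computed directly. A sketch, with a graph encoded as (vertices, edges) where each edge is a (name, src, trg) triple; the encoding and the name `funny_tensor` are mine:

```python
# Funny tensor product on Grph:
#   vertices(A ⊗ B) = vertices(A) × vertices(B)
#   edges(A ⊗ B)    = edges(A) × vertices(B)  ⊔  vertices(A) × edges(B)
def funny_tensor(g, h):
    gv, ge = g
    hv, he = h
    vertices = {(a, b) for a in gv for b in hv}
    # an edge of g paired with a vertex of h ...
    edges = {((e, b), (s, b), (t, b)) for (e, s, t) in ge for b in hv}
    # ... or a vertex of g paired with an edge of h
    edges |= {((a, f), (a, s), (a, t)) for a in gv for (f, s, t) in he}
    return vertices, edges

# Example: the walking edge x: 0 -> 1, tensored with itself.
A = ({0, 1}, {("x", 0, 1)})
V, E = funny_tensor(A, A)
print(len(V), len(E))  # 4 vertices, 4 edges -- and no diagonal edge
```

In particular A ⊗ A here is a 4-cycle-shaped square with no edge from (0,0) to (1,1), which is worth keeping in mind for what goes wrong below.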
But the whole game is lost by noticing that A ⊗ A = (A ∧ ◯A) ∨ (◯A ∧ A) ⊢ A, which shouldn't be derivable: in Grph there is no natural map A ⊗ A → A, since the funny tensor product has no diagonal. So that's a bummer. I'm not sure what I can do to repair that.
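Spelled out, the offending derivation is just ∧-elimination under ∨-elimination, using no modal reasoning at all, so no tweak to the ◯ rules will block it. A natural-deduction sketch:

```latex
% (A ∧ ◯A) ∨ (◯A ∧ A) ⊢ A, with ◯ written \bigcirc
\[
\dfrac{(A \wedge \bigcirc A) \vee (\bigcirc A \wedge A)
  \qquad \dfrac{[A \wedge \bigcirc A]^{1}}{A}\;{\wedge}\text{-elim}
  \qquad \dfrac{[\bigcirc A \wedge A]^{2}}{A}\;{\wedge}\text{-elim}}
 {A}\;{\vee}\text{-elim}^{1,2}
\]
```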