24. Bringing it all together

We are on the home stretch. In this episode we complete the set of equations of Interacting Hopf monoids, the technical name for the equational theory behind graphical linear algebra. Afterwards, we will not see any new equations for some time. Instead, we will explore what this equational theory is good for. In the next few episodes I hope to convince you that I haven’t been wasting your time with all of this!

First, since it’s been a while since the last episode, let’s start with a little reminder of where we are in the story. In the last two episodes we discussed some of the ways that adding and copying interact with their mirror images. The upshot was that we identified the equations for adding

[diagram: the equations for adding]

and copying

[diagram: the equations for copying]

Both of these situations are examples of the special Frobenius monoids that we touched on in the last episode, together with the additional bone equations. We saw that in any Frobenius monoid we can construct cups and caps, and in graphical linear algebra there are two equations that relate the black (copying) versions with the white (adding) versions. Basically, if you stick an antipode on one, the antipode can disappear as long as the colours are also inverted.

[diagram: the cup and cap equations relating the black and white versions via the antipode]

There are just a couple more equations to talk about. They are a bit different from those that we have seen so far because they involve the integers, for which we have constructed a syntactic sugar in earlier episodes.


Let’s turn our minds back to Episode 9 where we defined the sugar for natural numbers. The definition was recursive:

[diagram: the recursive definition of the natural number sugar]

Later, in Episode 18 where we met the antipode generator, we extended the sugar to all of the integers by letting:

[diagram: the sugar for negative integers, using the antipode]

For any integer w, our intuition for the behaviour of the w sugar was that the input arrives on the left, gets multiplied by w, and is output on the right.

[diagram: the behaviour of the w sugar]

The relational interpretation gives a relation of type Num ⇸ Num that consists of pairs (x, wx) for all numbers x. It’s worthwhile to keep in mind that this is not a definition: it follows from the way we defined the relational interpretation of all of the basic components; the relation for the w sugar then comes out when we compose all of the component relations, following the recursive recipe of the sugar.
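
To make this “not a definition” point concrete, here is a minimal Python sketch (my own illustration, not part of the blog’s development) that builds the relation of the w sugar by following the recursive recipe recalled above, over a finite sample of integer inputs, and then checks that the composite is exactly the set of pairs (x, wx). The names SAMPLE, compose and sugar are invented for the example.

    # A minimal sketch (invented names): relations of type Num -> Num are modelled
    # as sets of (input, output) pairs, sampled over a finite range of integers.
    SAMPLE = range(-10, 11)

    def compose(r, s):
        """Relational composition: (x, z) is in r;s iff (x, y) is in r and (y, z) is in s."""
        return {(x, z) for (x, y1) in r for (y2, z) in s if y1 == y2}

    def sugar(w, inputs=SAMPLE):
        """The relation of the w sugar, built by the recursive recipe:
        0   : discard then zero                  (every input relates to 0)
        n+1 : copy; (n sugar on one wire); add
        -n  : antipode followed by the n sugar   (the order does not matter relationally)."""
        if w < 0:
            antipode = {(x, -x) for x in inputs}
            return compose(antipode, sugar(-w, [-x for x in inputs]))
        if w == 0:
            return {(x, 0) for x in inputs}
        prev = sugar(w - 1, inputs)
        # copy the input, send one copy through the (w-1) sugar, then add the two wires
        return {(x, y + x) for (x, y) in prev}

    # The composite relation is exactly the set of pairs (x, w*x).
    for w in range(-4, 5):
        assert sugar(w) == {(x, w * x) for x in SAMPLE}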

Since by now we have all of the mirror image generators, we can also construct the mirror images of the syntactic sugars; i.e.

[diagram: the mirror images of the integer sugars]

With the functional intuition, it makes sense to think that in these backwards sugars the data flows in the opposite direction: the input arrives on the right and the output exits on the left. But relations don’t care about input-output assignments: given an integer w, the meaning of the backwards sugar is just the relation Num ⇸ Num with elements of the form (wx, x), where x is any number.
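
Relationally, the backwards sugar is nothing more than the converse of the forwards one. Here is a tiny sketch in the same spirit as before (again with invented names: converse, forwards_2, backwards_2):

    # The backwards sugar as the converse (mirror image) of the forwards one.
    def converse(r):
        """Swap the roles of input and output in a relation."""
        return {(y, x) for (x, y) in r}

    SAMPLE = range(-10, 11)
    forwards_2 = {(x, 2 * x) for x in SAMPLE}   # the 2 sugar: pairs (x, 2x)
    backwards_2 = converse(forwards_2)          # its mirror image: pairs (2x, x)

    assert backwards_2 == {(2 * x, x) for x in SAMPLE}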

The new equations of this episode tell the story of how, in graphical linear algebra, the forwards sugars interact with their mirror images.


First, let’s consider the situation where two sugars for w, one forwards and one backwards, meet head-on. Like this:

[diagram: the w sugar meeting its mirror image head-on]

When w=0, the above diagram is

[diagram: the head-on composition for w = 0]

The relation represented by this consists of all possible pairs of numbers (x, y); in other words, it is the cartesian product Num × Num.

What about the case when w≠0? For the sake of concreteness, let’s consider the case w=2 and work out its relational interpretation.

[diagram: the head-on composition for w = 2]

Here we are composing the relation with elements of the form (x, 2x) with the relation that has elements of the form (2y, y), where x and y are arbitrary. After composing these two relations, the result consists of pairs (x, y) that satisfy 2x=2y. If our universe of numbers Num is the integers, the rational numbers (i.e. fractions), the real numbers or the complex numbers then 2x=2y implies x=y, since we can simply cancel out the 2. Which means that the following is true, relationally speaking.

[diagram: the resulting equation for w = 2]

In fact, the same argument can be repeated for any nonzero integer w. This leads to the following equation, or more accurately, the following equation schema since it can be instantiated for any nonzero integer value of w, which here plays the role of a metavariable.

[diagram: the head-on equation schema for nonzero w]

By adding these equations, what we are really doing is assuming that our universe of numbers does not have zero divisors. A zero divisor is a nonzero u such that there exists a nonzero v with uv = 0. In the integers and many other familiar number systems, including the ones mentioned before, these things don’t exist. We can use this fact—the nonexistence of zero divisors—as follows: if wx = wy then wx − wy = 0, and so w(x − y) = 0 (†). Since w ≠ 0 and there are no zero divisors, (†) implies that it must be the case that x − y = 0, that is x = y.
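
For readers who like to see this computed, here is a small self-contained Python check over a finite sample of integers (the helper names are mine). It confirms that the head-on composite is the full cartesian product when w = 0 and collapses to the identity for nonzero w, and it hints at how a zero divisor, for instance working modulo 6, would spoil the cancellation:

    # Head-on composition over a finite sample of integers (invented helper names).
    SAMPLE = range(-10, 11)

    def compose(r, s):
        return {(x, z) for (x, y1) in r for (y2, z) in s if y1 == y2}

    def head_on(w):
        forwards = {(x, w * x) for x in SAMPLE}    # pairs (x, w*x)
        backwards = {(w * y, y) for y in SAMPLE}   # pairs (w*y, y)
        return compose(forwards, backwards)

    # w = 0: every x relates to every y, the full cartesian product.
    assert head_on(0) == {(x, y) for x in SAMPLE for y in SAMPLE}

    # w != 0: the integers have no zero divisors, so w*x = w*y forces x = y
    # and the composite collapses to the identity relation on the sample.
    for w in (1, 2, -3, 7):
        assert head_on(w) == {(x, x) for x in SAMPLE}

    # With zero divisors the cancellation would fail: modulo 6, for example,
    # 2*3 and 2*0 are both 0 even though 3 and 0 are different.
    assert (2 * 3) % 6 == (2 * 0) % 6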


We considered the case where the w sugars meet head-on, so let’s now take a look at what happens when they point outwards, like so:

[diagram: the w sugars pointing outwards]

Starting with the case w=0, we have:

[diagram: the outwards composition for w = 0]

which gives us the singleton relation consisting of (0, 0).

Now let’s take a look at w=2 again and work out the relation.

[diagram: the outwards composition for w = 2]

Here we are composing relations with elements of the form (2x, x) and (y, 2y), where x and y are arbitrary numbers. Composing the two enforces x = y, so we are left with the relation that has (2x, 2x) as its elements. If Num is the integers then this means that we are looking at a relation that contains pairs of equal even integers. If Num is the rationals, the reals or the complex numbers, then we can obtain all pairs (x, x): this is because, given any number x, we can divide it by 2, and compose (x, x/2) with (x/2, x). So, if division by 2 is something that we can do in our number system Num, then the following is true in the underlying relational interpretation.

[diagram: the resulting equation for w = 2]

Division by non-zero elements is something that can be done in any field, and the rationals, the reals and the complex numbers are three examples. So, if Num is a field then the following equation schema is supported by our relational intuitions.

[diagram: the pointing-outwards equation schema for nonzero w]
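
And here is a matching sanity check for the pointing-outwards composite, again a hedged sketch with invented names and finite samples: over the integers the composite for w = 2 only reaches pairs of equal even numbers, while over the rationals, where we can divide by 2, every pair (x, x) appears.

    from fractions import Fraction

    # Pointing-outwards composition, over the integers and over a sample of rationals.
    SAMPLE = range(-10, 11)

    def compose(r, s):
        return {(x, z) for (x, y1) in r for (y2, z) in s if y1 == y2}

    def point_out(w, carrier):
        backwards = {(w * x, x) for x in carrier}   # pairs (w*x, x)
        forwards = {(y, w * y) for y in carrier}    # pairs (y, w*y)
        return compose(backwards, forwards)

    # w = 0 gives just the singleton relation {(0, 0)}.
    assert point_out(0, SAMPLE) == {(0, 0)}

    # Over the integers, the w = 2 composite only contains pairs of equal even
    # numbers: an odd x has no integer witness x/2 in the middle.
    assert point_out(2, SAMPLE) == {(2 * x, 2 * x) for x in SAMPLE}

    # Over the rationals we can always divide by 2, so every pair (x, x) appears;
    # the witness for x is x/2, which we make sure is in the sampled carrier.
    rationals = [Fraction(n, d) for n in range(-6, 7) for d in range(1, 5)]
    carrier = rationals + [q / 2 for q in rationals]
    assert all((q, q) in point_out(2, carrier) for q in rationals)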


 

That’s it. We’ve covered all of the equations of graphical linear algebra!  We can now stop talking about intuitions and work purely algebraically, using diagrammatic reasoning within this equational system.

Let’s take a little bit of time to stare at the equations and appreciate them in their full glory. Here they are, all in one place.

[diagram: the equations of interacting Hopf monoids, all in one place]

In our papers, Filippo, Fabio and I have been calling this system interacting Hopf algebras, or interacting Hopf monoids. The reason for the name is that the system of equations in the Adding, Copying, Adding meets Copying and the Antipode boxes is sometimes called a (commutative) Hopf monoid, as we discovered in Episode 18. The system of equations in the Addingop, Copyingop, Addingop meets Copyingop and Antipodeop boxes is a Hopf monoid again; the mirror image. And the remaining four boxes: Adding meets Addingop, Copying meets Copyingop, Cups and Caps, and Scalars tell the story of how these two Hopf monoids interact.

Altogether, there are quite a lot of equations, but there is a lot of symmetry going on. There’s quite a bit of redundancy as well: some combinations of equations imply the others. I will eventually write an episode that focusses on redundancy, but we are not going to be slaves to Occam’s razor. Let’s be honest with each other; conciseness and minimality were never strong points of this blog.

The presence of symmetries is an important and interesting issue. Let’s point out two different ones. First, take any equation and invert the colours, swapping black and white; you get another fact, provable in the theory. This means that whatever theorem you prove, using diagrammatic reasoning, its “photographic negative” version is also true. We will take advantage of this symmetry quite a bit: it reduces your workload by half! Second, take any equation and look at it in the mirror; it is also an equation of the system. So, whatever fact you prove with diagrammatic reasoning, its mirror image is also a theorem.

I want to leave you with one more observation. We have spent so much time talking about adding and copying… but looking at the equations, which ones tell us that the black structure means copying and the white adding? Well, we have just admitted that inverting colours is a symmetry of the system, so the answer must be… none of them. Copying and adding satisfy the same equations, and interact in the same ways. I still find this fact to be rather beautiful and mysterious.

There are several other, important properties satisfied by this set of equations, but I’ll save them for when they are needed. Most importantly: we know that the theory of Hopf monoids is isomorphic to the prop of integer matrices, so to give a diagram in that theory amounts to writing down a matrix of integers. So what linear algebraic notion does the theory of interacting Hopf monoids correspond to? I will leave you with this cliffhanger and save the explanation for the next episode, where we will also understand the algebra of fractions, diagrammatically.


 

I was in Cali, Colombia, at the Javeriana University from the 25th of October to the 1st of November. I presented a tutorial on Graphical Linear Algebra at the ICTAC summer school, gave a talk at the Developments in Computational Models workshop, attended a whole bunch of inspirational talks, met up with some old friends and made some new ones. Everything was superbly organised and a lot of fun. Here’s a photo from my tutorial, where I seem to be demonstrating applications of graphical linear algebra to martial arts.

[photo: tutorial at the ICTAC summer school in Cali]

My favourite scientific talk was probably a mini tutorial by Jean-Raymond Abrial, also at the summer school, on doing maths with an “engineering” mindset. The idea is to capture the kind of process that goes on in a mathematician’s brain when writing a proof and formalise it (in a software tool!) as an instance of refinement, a software development concept popularised by Abrial and at the core of his B-method. Here’s a photo from that tutorial.

[photo: Jean-Raymond Abrial’s tutorial]

Abrial demonstrated this by proving the Cantor-Schröder-Bernstein theorem in the Rodin tool. Here’s what I think, but first a disclaimer: I’m a theorem-prover/automatic-proof-assistant newbie. Also, I have to admit that I’m not a huge fan of the B-method in general: using sets and relations as building blocks is a bit too untyped and too low-level for my taste. Nevertheless, Abrial’s demonstration was intriguing. His discovery and distillation of the process of refinement in a mathematical context was quite compelling, and seemed to closely resemble “real mathematical thinking”, whatever that is.

At some point during the conference I was bragging to some of the attendees that I never suffer from jet-lag. Well, that was true up until now: I wrote about half of this episode after 1:30am, during a series of relatively sleepless nights spread over two weeks after returning from Colombia. I had jet-lag with a vengeance: I spent my days in a sleepy daze and got all my energy after midnight. Thankfully I’ve finally got back to normal over the last few days.

Continue reading with Episode 25 – Fractions, diagrammatically.

4 thoughts on “24. Bringing it all together”

  1. Great episode! This ‘photographic inverse’ stuff and the fact that nothing seems to distinguish adding from copying equationally are indeed intriguing. Have you worked out some examples to get some intuition about what this says syntactically?


    • Thanks! On matrices we already know that combining the two symmetries (i.e. swapping the colours *and* flipping sides) gives you the notion of transpose matrix. And yes, the colour-swap symmetry also has a nice linear algebraic interpretation… I will explain that in a few episodes.

