After all the hype about relations in the last two episodes, it’s high time to use them to do something interesting. Let’s start by quickly reviewing the relational interpretation of our addition and zero generators. First, the addition:

Using our updated intuition, we are thinking of this as representing a relation of type **Num** × **Num** ⇸ **Num**, defined as the set of all pairs

((x, y), z) (†)

where x, y and z are any numbers that satisfy the condition x+y = z. So, for example, this relation contains the following three elements:

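To make the relational reading concrete, here is a minimal Python sketch (the name `add_rel` and the finite range are my own choices, not the blog's notation) that represents the addition relation as a set of pairs:

```python
# The addition relation, read relationally: all pairs ((x, y), z) with x + y = z.
# We restrict to a finite range of numbers so the set can be enumerated.
NUMS = range(-10, 11)

add_rel = {((x, y), z) for x in NUMS for y in NUMS for z in NUMS if x + y == z}

# Sample elements of the relation:
assert ((1, 2), 3) in add_rel
assert ((-3, 3), 0) in add_rel
# ...and a pair that is not in it:
assert ((1, 1), 3) not in add_rel
```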
The zero generator

also has a relational interpretation. Its type is {★} ⇸ **Num**. The reason for this is that **Num**^{0} is actually a singleton set—that is, a set containing precisely one element—and, since the name of this element does not matter, we may as well call it ★. The relational interpretation of the zero generator is the singleton relation

{(★, 0)}. (‡)

The fact that **Num**^{0} is a singleton is related to a fact memorised by everyone in high school: we all know that k^{0} = 1 for any integer k. We will not spend a lot of time getting very deep into this now; let me just say that it’s one of the things that category theory can explain really well, maybe even better than people on Quora. I will just outline the basic idea—and if the following is too technical, don’t worry, it’s not important for our story.

The cartesian product of sets is an example of a general phenomenon called a limit. Binary, ternary, and n-ary cartesian products (sets of pairs, triples, n-tuples, and so forth) are particularly simple instances: they are limits indexed by very simple categories.

For example, binary cartesian products are limits indexed by a category with two objects and only identity arrows. Such very simple categories, where there are no non-identity arrows, are sometimes called **discrete**.

Then, n-ary cartesian products are limits indexed by a discrete category with n objects. So what is a zero-ary cartesian product? A limit for the discrete category with no objects (i.e. the empty category) has its own name: it’s called a **terminal object**. And, in the category of sets, any singleton does the job of being terminal.
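Incidentally, the singleton nature of the zero-ary product can be observed concretely: Python's `itertools.product` with `repeat=0` yields exactly one element, the empty tuple, mirroring the fact that k^0 = 1 (a toy illustration, not part of the formal story):

```python
from itertools import product

nums = [0, 1, 2]

# Binary and unary cartesian products have 3**2 and 3**1 elements...
assert len(list(product(nums, repeat=2))) == 9
assert len(list(product(nums, repeat=1))) == 3

# ...and the zero-ary product is a singleton: its only element is the empty tuple.
assert list(product(nums, repeat=0)) == [()]
```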

One of the things that you can do with relations that you can’t, in general, do with functions is **turn them around**. The result is sometimes called the *opposite* relation. We can use this idea to make sense of the following new generator, which looks like the mirror image of addition:

We will be thinking of this strange new beast as representing “addition backwards”. Although that does not make much sense as a function from left to right, it makes perfect sense as a relation; it is the one that consists of pairs

(z, (x, y))

where x, y and z are any numbers that satisfy x+y = z. Of course, this is just (†), but backwards.
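Turning a relation around is a one-liner once relations are sets of pairs. Here is a small Python sketch (the helper name `converse` is my own, not the blog's notation):

```python
# The converse (opposite) of a relation just swaps each pair.
def converse(rel):
    return {(b, a) for (a, b) in rel}

NUMS = range(-10, 11)
add_rel = {((x, y), z) for x in NUMS for y in NUMS for z in NUMS if x + y == z}

# "Addition backwards" relates a number z to the pairs (x, y) with x + y = z.
add_op = converse(add_rel)
assert (3, (1, 2)) in add_op
assert (0, (-5, 5)) in add_op

# Turning a relation around twice gives back the original relation.
assert converse(add_op) == add_rel
```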

We can also reflect the zero generator to get a new generator that looks like this:

The relation this generator represents is, not surprisingly,

{(0, ★)}

which is just (‡) backwards.

As we do more graphical linear algebra, you will see that these mirror versions of addition and zero are extremely useful to have around!

Back when we were talking about ordinary addition, we identified some equations that concern the addition and zero generators. These were (Comm), (Unit) and (Assoc), and here they are again:

Since the backwards addition is still addition, it satisfies the same equations, but backwards.

It’s useful to notice that these end up looking a lot like the equations for copying; here they are again, as a reminder:

In fact, the equations for adding backwards and copying are exactly the same, apart from the colouring of the nodes. The colours make sure that we don’t confuse backwards addition and copying, since our intuition is that they stand for different relations. Also, again in order to avoid confusion, we tag all the new equation names with op, for *opposite*. We will come back to copying, viewed relationally, in the next episode.

The equations for backwards addition are not controversial. In fact, they are rather boring, and we have not really discovered anything new about addition so far. The interesting, new stuff comes when we think about what happens when addition sees itself in the mirror.

So what happens when we connect the adding generator to its mirror image? Let’s focus on the most interesting examples first, and concentrate on the relation represented by the following diagram:

We could translate everything back to first principles and compute the composition as in **Rel**, but just by looking at the diagram it’s clear that there are three variables involved: the numbers on the three wires that connect as arguments to the two additions. Let’s call these x, y and z and label the wires accordingly:

So, the relation consists of pairs

((x, y+z), (x+y, z)) (①)

for all choices of x, y and z. For example, taking x=1, y=-1 and z=3, we see that

((1, 2), (0, 3))

is in the relation. Another example, taking x=3, y=2 and z=-5, gives us

((3, -3), (5, -5)).

Looking at these examples, it seems that there may be another way of describing these pairs: those in which adding up the two numbers in each component gives equal sums. Indeed, in the examples above we have 1+2 = 0+3 = 3 and 3+(-3) = 5+(-5) = 0.
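These membership checks can be mechanised. Below is a Python sketch (helper names `converse` and `compose` are mine) that composes the addition relation with its converse over a finite range, just as one would compute the composition in **Rel**, and confirms the two example pairs above:

```python
def converse(rel):
    return {(b, a) for (a, b) in rel}

def compose(r, s):
    # Relational composition: (a, c) is in r;s iff some b has (a, b) in r and (b, c) in s.
    return {(a, c) for (a, b1) in r for (b2, c) in s if b1 == b2}

NUMS = range(-10, 11)
add_rel = {((x, y), z) for x in NUMS for y in NUMS for z in NUMS if x + y == z}

# Addition followed by backwards addition.
diagram = compose(add_rel, converse(add_rel))

# x=1, y=-1, z=3 gives the pair ((1, 2), (0, 3))...
assert ((1, 2), (0, 3)) in diagram
# ...and x=3, y=2, z=-5 gives ((3, -3), (5, -5)).
assert ((3, -3), (5, -5)) in diagram
```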

So let’s see if ① is actually the same relation as ② below, which consists of all pairs

((p, q), (r, s))

where p+q = r+s. Incidentally, ② is the relation represented by:

Just by looking at ① and ②, it is clear that every instance of ① is an instance of ②, since summing up either of the two components of ① gives x+y+z. Then, to show that ① and ② are the same relation, it’s enough to show that every instance of ② is an instance of ①.

So let’s take a closer look at ②. Since p+q = r+s, we can express each variable in terms of the other three; in particular, q=(r-p)+s and r=p+(q-s). This is almost in the form required by ①; we just need to show that r-p = q-s, and indeed:

r-p = p+(q-s)-p = q-s.

That finishes the job: ① and ② denote the same relation. These arguments give us the justification for introducing the following intriguing equation.
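For extra reassurance, the equality of ① and ② can also be checked by brute force over a bounded range of numbers (a sanity check under an arbitrary bound, not a substitute for the proof above):

```python
N = 10
SMALL = range(-N, N + 1)        # the components we keep
BIG = range(-2 * N, 2 * N + 1)  # enough room for the intermediate variable y

def in_range(*nums):
    return all(-N <= n <= N for n in nums)

# Relation ①: pairs ((x, y+z), (x+y, z)), components restricted to SMALL.
rel1 = {((x, y + z), (x + y, z))
        for x in BIG for y in BIG for z in BIG
        if in_range(x, y + z, x + y, z)}

# Relation ②: pairs ((p, q), (r, s)) with p + q = r + s.
rel2 = {((p, q), (r, s))
        for p in SMALL for q in SMALL for r in SMALL for s in SMALL
        if p + q == r + s}

assert rel1 == rel2
```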

In fact, the argument above can be recycled to show that also

since the calculations involved are clearly symmetric.

A nice way of memorising ③ is that it says “Z = X”: the diagram on the left looks a bit like the letter Z and the one on the right looks like an X. Then, the mnemonic for ④ says, for similar reasons, that X = S. Together, we have Z = X = S, and so in particular Z = S.

These three equations are quite famous; they are called the **Frobenius equations**. It’s a little bit redundant to keep all three, because it turns out that (Frob), ③ and ④ are equally powerful: any one of them lets you prove the other two, using diagrammatic reasoning. So, out of the three, we will just keep (Frob), because it is the most symmetrically pleasing.

For example, below is a proof of ③, assuming (Frob). I’ll leave the other five implications for you as nice exercises!

By the way, structures satisfying these equations, repeated below in an anonymous, gray mode, are often referred to as (commutative) Frobenius monoids.

Ferdinand Georg Frobenius was a German mathematician, active in the late 19th and early 20th century. He was very good, but he also happened to have an extremely cool last name. These two facts combined mean that a surprisingly large number of mathematical notions are named after him. He also had a pretty inspirational beard, as verified by the following photo.

The Frobenius equation (Frob) is probably most famous for the role it plays in the impressively named and difficult-sounding field of 2-dimensional topological quantum field theories (2D TQFT – even the acronym is scary!). There’s a very nice book on the subject by Joachim Kock called *Frobenius algebras and 2D topological quantum field theories*. You can’t say that the title is misleading. And 2D TQFT, despite the name, is actually not that difficult to get your head around. A bit like the Sierpinski space.

Frobenius didn’t actually discover the Frobenius equation. As is often the case, there is a bit of controversy about who exactly thought of it first. Mathematicians can get quite uppity about this sort of thing. Some mathematics message boards descend into mini flame wars as people argue about what exactly was written in the corner of some blackboard at an obscure conference somewhere off the coast of Britain in the mid 1960s. I guess that professional mathematicians are amongst the very few people who actually remember any details of that particular decade.

My feeling is that—as is not uncommon with good maths—it’s likely that a number of people thought of it at around the same time. The Frobenius equation was in the air. And it’s going to be a favourite of ours on this blog.

Having said all that, one of the earliest people to realise the importance of the Frobenius equation was Bob Walters, whom I talked about in Episode 12. If you’re interested in the history, take a look at a blog entry of his here. Note that Bob talks about the equation X = S, our equation ④. But as we’ve discussed, it doesn’t matter which of (Frob), ③ or ④ you consider.

There are two more equations that show interesting ways in which addition interacts with itself in the mirror. We have already seen the first one in the last episode, where it featured as one of the equations of the special bimonoid, the structure isomorphic to the PROP of relations. It is none other than the “special” equation. Unfortunately the name “special”—terrible as it is—seems to have caught on, so we are stuck with it. Here it is:

Let’s convince ourselves that it makes sense in our context of addition interacting with its mirror image. Doing the simple calculation, the left diagram represents the relation

{ (x, y) | there exist numbers u, v such that u+v = x and u+v = y }.

Now, given any element (x, y) in this relation, it’s easy to see that x = y, since they are both equal to u+v for some u and v. Also, for any number z, (z, z) is in the relation because we can take, for example, u = z and v = 0. These two arguments combine to mean that the relation is actually the identity relation

{ (x, x) | x is a number }

which is, of course, the relational meaning of the identity wire. So, long story short, (Special) is compatible with our intuitions.
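The same argument is easy to machine-check over a bounded range (a Python sketch; the bound is arbitrary):

```python
N = 10
NUMS = range(-N, N + 1)

# Left side of (Special): pairs (x, y) such that some u, v give u+v = x and u+v = y.
# With u, v ranging over NUMS, the sums u+v cover exactly [-2N, 2N].
left = {(u + v, u + v) for u in NUMS for v in NUMS}

# Right side: the identity relation on the numbers those sums can reach.
identity = {(s, s) for s in range(-2 * N, 2 * N + 1)}

assert left == identity
```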

The second equation, (WBone), is pretty self-evident: it’s the white version of the bone law from Episode 8. Both of the diagrams in (WBone) represent the relation that contains the single element (★, ★).

The table below summarises the equations that we have discussed in this episode, which capture the various ways in which addition interacts with its backwards twin.

Unfortunately, the rate of articles has slowed down somewhat recently. This is mainly due to the time of year: it’s the start of semester, so teaching, supervision and related activities are taking up a lot of my time. Also, coincidentally, this is the time of year when there are a few paper deadlines. And on top of all that, I have a few pretty exciting research leads to chase. I’m sure that some of these will make it onto the blog eventually.

I hope that the pace will pick up again in October. If you’ve stuck with the series so far, let me just say that we are getting very close to the really interesting stuff!

Continue reading with Episode 23 – Frobenius Snakes and Spiders.
