# Let’s Talk About Group Isomorphism

“Let’s talk about group isomorphism!” said no one ever.

Group isomorphism is obscure, complicated, and technical. It’s also one of the coolest ideas I’ve ever encountered, one of those real wow! moments that changed the way I think about math, the universe and everything. And, I believe, it’s very relevant to AI.

So here’s the deal: I’ll do my best to lead you through the swamp, and in return, you’ll get a new perspective on mathematics – and maybe even the nature of intelligence. Agreed?

We’ll start with something simple. Or rather, two somethings: addition and multiplication, + and ×. Our old arithmetic friends. Can’t get much simpler than these guys, right?

6 + 8 = 14

3 × 7 = 21

Have you ever thought about how similar addition and multiplication are? Both of them take two numbers, do something mathy to them, and spit out another number.

In fact, the similarities don’t stop there. Both operations are also associative, meaning we can throw in parentheses anywhere and it doesn’t change the answer:

(1 + 2) + 3 = 1 + (2 + 3) = 6

(2 × 3) × 5 = 2 × (3 × 5) = 30

That may not seem like much, but it’s a property subtraction doesn’t have: (5 − 3) − 1 = 1, while 5 − (3 − 1) = 3.

Addition and multiplication also both have an identity element. For addition it’s zero, for multiplication it’s one. Either way, pairing the identity element with any number leaves that number unchanged.

5 + 0 = 5

7 × 1 = 7

What’s more, addition and multiplication both have the idea of an inverse. For any number, you can pair it with its inverse, and get the identity element.

18 + (-18) = 0

37 × (1/37) = 1
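All three properties are easy to verify numerically. Here’s a quick sketch in Python that checks each of the examples above (using an approximate comparison for the inverse, since 1/37 isn’t exact in floating point):

```python
import math

# Associativity: grouping doesn't change the answer
assert (1 + 2) + 3 == 1 + (2 + 3) == 6
assert (2 * 3) * 5 == 2 * (3 * 5) == 30

# Identity elements: 0 for addition, 1 for multiplication
assert 5 + 0 == 5
assert 7 * 1 == 7

# Inverses: pairing a number with its inverse gives the identity
assert 18 + (-18) == 0
assert math.isclose(37 * (1 / 37), 1)

print("all properties check out")
```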

Addition and multiplication really are a lot alike, aren’t they? So much alike, in fact, that you almost get the feeling that at some deeper level, they’re doing the same thing, just with different symbols. Like they’re two different languages, and all we need is a translator…

In fact, you can translate (or “map”) between multiplication and addition. The key is the natural log function, ln(x).

For those who aren’t familiar with it, the natural log function is a button on any scientific calculator, and a function in Excel too (just type “=LN(5)” for example). If you want more details about what it actually means, Google is your friend. For now, what we care about is that it will translate between multiplication and addition.

What do I mean by that? Let’s see an example.

3 × 5 = 15

Well, ln(3) = 1.098612, and ln(5) = 1.609438, and ln(15) = 2.70805. These numbers may look like random garbage, but they have a remarkable property:

1.098612 + 1.609438 = 2.70805

(Ignoring rounding errors, of course.)

In other words:

ln(3) + ln(5) = ln(15)

There’s our translation.
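If you’d rather not trust a calculator, here’s the same translation checked in Python, where the natural log is `math.log`:

```python
import math

# ln translates multiplication into addition:
# ln(3) + ln(5) should equal ln(3 × 5) = ln(15)
lhs = math.log(3) + math.log(5)
rhs = math.log(15)

print(lhs)  # ≈ 2.708050
print(rhs)  # ≈ 2.708050

# Equal up to floating-point rounding
assert math.isclose(lhs, rhs)
```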

Say we wanted to calculate 6 × 7, but we’re allergic to multiplication. We just can’t do it, for whatever reason. But we can do addition, and that’s all we need.

All we have to do is “translate” our multiplication problem into the language of addition, using the natural log function:

ln(6) = 1.791759

ln(7) = 1.94591

Now we’re in the world of addition, and I can do addition!

1.791759 + 1.94591 = 3.737669

We’ve got our answer! We just need to “translate” it back into multiplication world.

We know that ln(answer) = 3.737669. What we need is the ln function in reverse, a sort of un-natural-log, to go back to multiplication world. In mathematical terms, we’re looking for the inverse function, the function that will undo ln(x). As it happens, the inverse function for ln(x) is e^x, so we take e^3.737669 and get…

42, the answer to our original multiplication problem, 6 × 7.
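The whole round trip fits in a few lines of Python. This is just a sketch of the procedure described above (the function name is mine, not standard): translate with `math.log`, add, translate back with `math.exp`.

```python
import math

def multiply_via_addition(a, b):
    """Multiply two positive numbers without ever multiplying:
    translate into 'addition world' with ln, add, and translate
    back with e^x (the inverse of ln)."""
    return math.exp(math.log(a) + math.log(b))

result = multiply_via_addition(6, 7)
print(round(result))  # 42
```

Incidentally, this is exactly how slide rules worked: they multiplied by mechanically adding lengths laid out on a logarithmic scale.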

Roughly speaking, the real numbers under addition and the positive real numbers under multiplication are what mathematicians call groups. And because these two groups are fundamentally the same, they are called isomorphic (“iso” means same, “morph” means form). Group isomorphism.

To me, it’s fascinating that addition and multiplication – which seem totally different on the surface – are somehow the exact same thing “under the covers.” Any result you derive for one operation can be “translated” to the other. Our world hides deeper symmetries than we suspect.

Tomorrow, I’ll show how some of those deeper symmetries apply to AI.

Questions?