Symmetric Properties
If you take a square and rotate it 90 degrees, you get back an identical square. We say the square is rotationally symmetric. Similarly, if you reflect it across an axis, you also get the same square. Since "reflectionally" isn't a word, we can more generally say it's "symmetric under reflection".
If you take any two numbers and compute x + y, you'll get the same answer if you instead compute y + x. We say that addition is commutative, but it's just as correct to say it's symmetric under transposition of terms. Subtraction is symmetric under "double-increment" (x - y = (x + 1) - (y + 1)), while multiplication is symmetric under skewing (x*y = 2x * y/2).
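These symmetries are easy to spot-check with a quick property test. Here's a minimal sketch (the helper `symmetric_under` and the sample generation are mine, not from any library) that verifies each claim on random integers:

```python
import random

def symmetric_under(f, transform, samples):
    """Check that f(x, y) == f(*transform(x, y)) on every sample pair."""
    return all(f(x, y) == f(*transform(x, y)) for x, y in samples)

samples = [(random.randint(-100, 100), random.randint(-100, 100))
           for _ in range(1000)]

# Addition is symmetric under transposition of terms.
assert symmetric_under(lambda x, y: x + y, lambda x, y: (y, x), samples)
# Subtraction is symmetric under "double-increment".
assert symmetric_under(lambda x, y: x - y, lambda x, y: (x + 1, y + 1), samples)
# Multiplication is symmetric under skewing (y/2 is exact for these magnitudes).
assert symmetric_under(lambda x, y: x * y, lambda x, y: (2 * x, y / 2), samples)
```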
In formal specification, we can express symmetries as properties of the system. Most commonly this is transpositional symmetry. If F(x, y) is some predicate (boolean function), then F is symmetric if

all x, y | F(x, y) => F(y, x)

where => is implication. Transpositional symmetry is so common that if you tell someone your business logic has a symmetry property, that's usually what they'll think you mean.
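For finite domains, the quantified property above translates directly into an exhaustive check. A sketch (the function name and example predicates are hypothetical):

```python
from itertools import product

def is_symmetric(F, domain):
    """Check: all x, y | F(x, y) => F(y, x), over a finite domain."""
    return all((not F(x, y)) or F(y, x) for x, y in product(domain, repeat=2))

nums = range(-5, 6)
assert is_symmetric(lambda x, y: x + y == 0, nums)  # symmetric predicate
assert not is_symmetric(lambda x, y: x < y, nums)   # < is not symmetric
```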
Symmetry properties are useful in formal specification because they're common, easy to think of, and often influence the design. If you ever have a relation between two things of the same type, you can ask yourself if there's a symmetry relation you need to check for.
A more interesting example
Facebook has three kinds of graph relations between users. You can be Friends with a user, you can Invite a user, and you can Follow a user. We can define the predicate Friends(u, v) to mean "u and v are friends".
Friends is symmetric: if I'm friends with you, you're friends with me. More formally:
FriendshipSymmetric = all u, v: User | Friends(u, v) => Friends(v, u)
This is one reason why we don't store many-to-many relationships as an array field in a user's SQL record. It's too easy for the system to break symmetry, like if u.friends = {a, b, c, v} while v.friends = {d, e}. Having a separate relationship table makes it easier to keep things in order.
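The same idea works in application code: store each friendship once, as an unordered pair, and symmetry holds by construction. A hypothetical sketch (a SQL version would be a single edge table with a canonical column ordering):

```python
class FriendGraph:
    """Store friendships as unordered pairs so symmetry can't break."""
    def __init__(self):
        self._edges = set()

    def befriend(self, u, v):
        # frozenset((u, v)) == frozenset((v, u)), so order is irrelevant.
        self._edges.add(frozenset((u, v)))

    def friends(self, u, v):
        return frozenset((u, v)) in self._edges

g = FriendGraph()
g.befriend("u", "v")
assert g.friends("u", "v") and g.friends("v", "u")  # symmetric by construction
```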
Follow is asymmetric: Follow(u, v) tells us nothing about Follow(v, u).
Invites are interesting. If I invite you, you shouldn't be able to invite me. If you try to invite me, the system should instead accept the invitation, making us both friends. Invites are antisymmetric: Invited(u, v) => !Invited(v, u). Antisymmetry properties are really easy to break. What if the invite service is eventually consistent and we invite each other simultaneously? You also have to worry about feature interaction. Say there's a way to merge two accounts: A invites B, B invites C, and then A and C merge. What happens then?
(Presumably in both cases we'd drop both invites and add a friendship relationship, but that'd have to be explicitly coded and it's something that'd be easy to miss.)
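The "counter-invite becomes a friendship" behavior, and the antisymmetry check itself, can be sketched like this (the class and method names are made up for illustration; this ignores the eventual-consistency and merge hazards above):

```python
class InviteService:
    """Sketch: accepting a counter-invite preserves antisymmetry."""
    def __init__(self):
        self.invites = set()   # directed (u, v) pairs
        self.friends = set()   # unordered frozenset pairs

    def invite(self, u, v):
        if (v, u) in self.invites:
            # Counter-invite: drop the pending invite, become friends.
            self.invites.discard((v, u))
            self.friends.add(frozenset((u, v)))
        elif frozenset((u, v)) not in self.friends:
            self.invites.add((u, v))

    def antisymmetric(self):
        # Invited(u, v) => !Invited(v, u)
        return all((v, u) not in self.invites for (u, v) in self.invites)

s = InviteService()
s.invite("a", "b")
s.invite("b", "a")  # treated as accepting the invitation
assert s.antisymmetric() and frozenset(("a", "b")) in s.friends
```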
Symmetry optimizations
Symmetry is also important in optimization problems, like constraint solving. Consider the problem of conference scheduling: we've surveyed the attendees on their most-anticipated talks, and are now trying to assign talks to slots (rooms + times) to minimize the number of "favorites conflicts". If two rooms are indistinguishable, it doesn't matter whether we assign talk A to slot R1T1 and B to R2T1 or assign B to R1T1 and A to R2T1. Therefore we don't need to check any layouts involving the latter. Typically you'd optimize this by adding a "symmetry breaking" constraint, like "talk A must be in R1".
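As an illustrative brute-force sketch (the talks, slots, and cost function are all invented): because the cost only depends on *times*, rooms R1 and R2 are interchangeable, so pinning talk A to room R1 halves the search space without losing the optimum.

```python
from itertools import permutations

talks = ["A", "B", "C", "D"]
slots = ["R1T1", "R2T1", "R1T2", "R2T2"]   # two rooms x two times
time = {"R1T1": 1, "R2T1": 1, "R1T2": 2, "R2T2": 2}

def conflicts(assignment):
    # Made-up survey data: attendees want both (A, B) and both (C, D);
    # each pair sharing a time slot costs one "favorites conflict".
    return ((time[assignment["A"]] == time[assignment["B"]]) +
            (time[assignment["C"]] == time[assignment["D"]]))

all_layouts = [dict(zip(talks, p)) for p in permutations(slots)]

# Symmetry-breaking constraint: talk A must be in room R1.
pruned = [a for a in all_layouts if a["A"].startswith("R1")]

# Pruning symmetric layouts loses no optima, but checks half as many.
assert min(map(conflicts, pruned)) == min(map(conflicts, all_layouts))
print(len(all_layouts), len(pruned))  # 24 12
```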
I use this a lot in model checking TLA+, too, which has a built-in symmetry-breaking feature. I don't know how common it is in other FM tooling. This paper seems to suggest it's applicable to SMT solving, at least.
If you're reading this on the web, you can subscribe here. Updates are once a week. My main website is here.
My new book, Logic for Programmers, is now in early access! Get it here.