Introduction
In this post, we aim to glean as much as we can about the characters of the symmetric groups (today we’ll be focusing on $S_3$ and $S_4$) using simple properties of characters.
In particular, here are the main tools we’ll be utilizing today and in upcoming posts, where we plan to tackle finite matrix groups like $GL_n(\mathbb{F}_p)$ for a positive integer $n$ and prime $p$. In what follows, let $G$ be a finite group and $k$ the number of conjugacy classes of $G$.
- The irreducible characters $\chi_1, \chi_2, \dots, \chi_k$ of $G$ form an orthonormal basis for the vector space of class functions on $G$. Recall that a class function is one that is constant when restricted to a single conjugacy class of $G$. In particular, this means $\langle \chi_i, \chi_j \rangle = 1$ if $i = j$ and $\langle \chi_i, \chi_j \rangle = 0$ if $i \neq j$ for irreducible characters $\chi_i$ and $\chi_j$ (see the short sketch of this inner product right after this list).
- Let $V_i$ for $1 \leq i \leq k$ be the irreducible representations of $G$. Then we have the sum of squares formula: $\sum_{i=1}^{k} (\dim V_i)^2 = |G|$.
- Lastly, given two representations $V$ and $W$ of $G$, we can use the tensor product of vector spaces to create another representation – the tensor product representation – of $G$. We let the vector space of the new representation be $V \otimes W$. Next, a $g \in G$ acts on $V \otimes W$ by the diagonal action: $g \cdot (v \otimes w) = (g \cdot v) \otimes (g \cdot w)$ for all elementary tensors $v \otimes w$, and we extend linearly to all tensors. Later, we’ll see the effect of tensoring two representations on the character.
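To make the first bullet concrete, here is a minimal Python sketch of that inner product, assuming each character is stored as a list of values (one per conjugacy class) alongside the sizes of those classes; the $S_3$ characters used below are the ones we compute later in this post.

```python
def inner_product(chi, psi, class_sizes):
    """<chi, psi> = (1/|G|) * sum over classes of |C| * chi(C) * conj(psi(C))."""
    order = sum(class_sizes)                      # |G| is the sum of the class sizes
    total = sum(n * complex(a) * complex(b).conjugate()
                for n, a, b in zip(class_sizes, chi, psi))
    return total / order

# S_3: the classes are {e}, the three transpositions, and the two 3-cycles.
class_sizes  = [1, 3, 2]
chi_trivial  = [1, 1, 1]
chi_standard = [2, 0, -1]
print(inner_product(chi_trivial, chi_trivial, class_sizes))    # (1+0j): orthonormal
print(inner_product(chi_trivial, chi_standard, class_sizes))   # 0j: orthogonal
```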
First up, $S_3$!
The most natural representation of $S_3$ (in fact, this hardly seems like a representation at all!) would be to let $\rho : S_3 \to GL(V)$ be such that it sends $\sigma \in S_3$ to its $3$ by $3$ permutation matrix, with $V = \mathbb{C}^3$, the three dimensional space. For instance,

$$\rho((1\,2)) = \begin{pmatrix} 0 & 1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 1 \end{pmatrix} \quad \text{and} \quad \rho((1\,2\,3)) = \begin{pmatrix} 0 & 0 & 1 \\ 1 & 0 & 0 \\ 0 & 1 & 0 \end{pmatrix}.$$
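If you’d like to experiment with this numerically, here is a short sketch; the convention that permutations are tuples in 0-indexed one-line notation is my own choice for the sketch.

```python
import numpy as np

def perm_matrix(sigma):
    """Matrix P with P e_j = e_{sigma(j)}, i.e. P[sigma[j], j] = 1."""
    n = len(sigma)
    P = np.zeros((n, n), dtype=int)
    for j, i in enumerate(sigma):
        P[i, j] = 1
    return P

swap_12   = (1, 0, 2)   # the transposition (1 2) in 0-indexed one-line notation
cycle_123 = (1, 2, 0)   # the 3-cycle (1 2 3): 0 -> 1 -> 2 -> 0

def compose(s, t):
    """(s o t)(j) = s(t(j))."""
    return tuple(s[t[j]] for j in range(len(t)))

# rho(s o t) == rho(s) rho(t): the assignment really is a homomorphism.
lhs = perm_matrix(compose(swap_12, cycle_123))
rhs = perm_matrix(swap_12) @ perm_matrix(cycle_123)
assert (lhs == rhs).all()
```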
Unfortunately, this is not an irreducible one: all the $\rho(\sigma)$’s leave unchanged the vectors of the form $(z, z, z)$ for some complex number $z$. In other words, $L = \{(z, z, z) : z \in \mathbb{C}\}$ is an invariant subspace of $V$ (an invariant line, to be sure). The corresponding projection operator $\pi$ onto $L$ is $\pi(z_1, z_2, z_3) = \frac{z_1 + z_2 + z_3}{3}(1, 1, 1)$ for all $(z_1, z_2, z_3) \in V$, and

$$W := \ker \pi = \{(z_1, z_2, z_3) \in \mathbb{C}^3 : z_1 + z_2 + z_3 = 0\}.$$

That is, $W$ is the orthogonal complement of $L$ under the usual dot product. By definition, $V = L \oplus W$.
According to Maschke’s theorem, $W$ is also an invariant subspace. We can do a quick spot check: $\rho((1\,2))(z_1, z_2, z_3) = (z_2, z_1, z_3)$ for $(z_1, z_2, z_3) \in W$. Setting $(z_1, z_2, z_3) = (1, -1, 0)$, we see that $\rho((1\,2))(1, -1, 0) = (-1, 1, 0)$ and $-1 + 1 + 0 = 0$ still works, and so the image lands back in $W$, as we expect.
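Here is a small sketch of the same spot check done exhaustively: every permutation matrix fixes the line $L$ and keeps the sum-zero subspace $W$ inside itself (the helper `perm_matrix` uses the same assumed convention as before).

```python
import itertools
import numpy as np

def perm_matrix(sigma):
    n = len(sigma)
    P = np.zeros((n, n), dtype=int)
    for j, i in enumerate(sigma):
        P[i, j] = 1
    return P

ones = np.array([1, 1, 1])
w = np.array([2, -1, -1])            # an arbitrary vector whose coordinates sum to 0

for sigma in itertools.permutations(range(3)):
    P = perm_matrix(sigma)
    assert (P @ ones == ones).all()  # the line L is fixed pointwise
    assert (P @ w).sum() == 0        # the image of w stays inside W
```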
The sub-representation on $L$ is thus just the trivial representation: there just isn’t much freedom offered by a good ol’ line. However, the degree 2 representation on $W$ is more interesting. Notice that $\dim W = 2$, so the restrictions $\rho(\sigma)|_W$ can be identified with $2$ by $2$ matrices. Fixing $\{(1, -1, 0), (0, 1, -1)\}$ as a basis for $W$, and then writing $\rho(\sigma)|_W$ for $\sigma = (1\,2)$ and $\sigma = (1\,2\,3)$ as a matrix, gives us:

$$\rho((1\,2))|_W = \begin{pmatrix} -1 & 1 \\ 0 & 1 \end{pmatrix} \quad \text{and} \quad \rho((1\,2\,3))|_W = \begin{pmatrix} 0 & -1 \\ 1 & -1 \end{pmatrix}.$$

Of course, the identity goes to the $2 \times 2$ identity matrix as usual. This representation is in fact irreducible: nonzero vectors in $W$ must have at least two distinct coordinates, which will switch places when one applies the relevant transposition, so no line in $W$ is left invariant by all of $S_3$.
The corresponding character (recall that the trace does not depend on the choice of the basis of $W$) is just $-1$ if the permutation is even and not the identity, and $0$ if the permutation is odd. Taking into account the identity, we can say $\chi(e) = 2$, $\chi(\sigma) = -1$ if $\sigma \neq e$ is even, and $\chi(\sigma) = 0$ if $\sigma$ is odd.
We call this character $\chi_{\text{std}}$ and the corresponding representation the standard representation, $V_{\text{std}}$. Check that the inner product of the standard character with itself is $1$, confirming the fact that this representation is irreducible.
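As a sanity check, here is a sketch that computes $\chi_{\text{std}}(\sigma)$ as the number of fixed points of $\sigma$ minus $1$ (the trace of the permutation matrix minus the trace contributed by the invariant line) and confirms that its norm is $1$, this time summing over all six group elements rather than over classes.

```python
import itertools

def chi_std(sigma):
    """Trace of the permutation matrix minus 1 = (#fixed points) - 1."""
    return sum(1 for j, i in enumerate(sigma) if i == j) - 1

group = list(itertools.permutations(range(3)))
norm_sq = sum(chi_std(s) ** 2 for s in group) / len(group)
print(norm_sq)   # 1.0, so the standard representation is irreducible
```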
Now, let that unknown third and final (recall that $S_3$ has $3$ conjugacy classes, one for each partition of $3$, so we’re looking at three irreducible characters) irreducible character be $\chi_3$. By the sum of squares formula, $1^2 + 2^2 + \chi_3(e)^2 = 6$, so $\chi_3(e) = 1$. Let $\chi_3(\sigma) = a$ for transpositions $\sigma$ and $\chi_3(\sigma) = b$ for 3-cycles $\sigma$. To find $a$ and $b$, we’re going to leverage the orthogonality of irreducible characters. Indeed,

$$\langle \chi_3, \chi_{\text{triv}} \rangle = \frac{1}{6}\left(1 \cdot 1 + 3a \cdot 1 + 2b \cdot 1\right) = 0,$$

and utilizing the other character we have

$$\langle \chi_3, \chi_{\text{std}} \rangle = \frac{1}{6}\left(1 \cdot 2 + 3a \cdot 0 + 2b \cdot (-1)\right) = 0.$$

Putting these numbers together, we get $a = -1$ and $b = 1$. Thus, $\chi_3(\sigma)$ is simply the sign of $\sigma$! We call it the sign character: $\chi_{\text{sgn}}$.
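The same two orthogonality equations can be handed to a computer algebra system; here is a quick sketch with sympy, where $a$ and $b$ are the unknowns above and $\chi_3(e) = 1$ is baked in.

```python
from sympy import symbols, solve, Rational

a, b = symbols("a b")
# <chi_3, chi_triv> = 0  and  <chi_3, chi_std> = 0 over S_3 (class sizes 1, 3, 2):
eq_triv = Rational(1, 6) * (1 * 1 + 3 * a * 1 + 2 * b * 1)
eq_std  = Rational(1, 6) * (1 * 2 + 3 * a * 0 + 2 * b * (-1))
print(solve([eq_triv, eq_std], [a, b]))   # {a: -1, b: 1}
```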
All in all, we now have the character table of $S_3$!

| $S_3$ | $e$ | $(1\,2)$ | $(1\,2\,3)$ |
| --- | --- | --- | --- |
| $\chi_{\text{triv}}$ | $1$ | $1$ | $1$ |
| $\chi_{\text{std}}$ | $2$ | $0$ | $-1$ |
| $\chi_{\text{sgn}}$ | $1$ | $-1$ | $1$ |

While to an outsider this table might seem to contain nine independent pieces of information, the restrictions induced by the deeper symmetries of the group mean that in reality the entire table can be filled out given just two of its entries.
Throughout mathematics, the notions we tend to study are the ones that strike the right balance between generality and structure. A specific object, while it may have much structure to analyze, would be incapable of describing the bigger, more general picture. On the other hand, too abstract an object would lose all restrictions, making it difficult to conjecture patterns about it in the first place.
Onto $S_4$!
Let’s continue our analysis of symmetric groups with the next one: $S_4$. As there are $5$ integer partitions of $4$, we’ll have a total of $5$ irreducible characters (a conjugacy class of $S_4$ corresponds to a partition of $4$: for instance, $4 = 2 + 2$ corresponds to products of two disjoint $2$-cycles like $(1\,2)(3\,4)$, and $4 = 3 + 1$ corresponds to $3$-cycles such as $(1\,2\,3)$).
As usual, we have the trivial character $\chi_{\text{triv}}$, that returns $1$ for all $\sigma \in S_4$, and (just as for $S_3$) the sign character $\chi_{\text{sgn}}$ that returns the sign of $\sigma$; we will need both below. In much the same way as last time, we can construct a natural representation for $S_4$ that assigns a $\sigma \in S_4$ to the corresponding $4$ by $4$ permutation matrix, viewed as an element of $GL_4(\mathbb{C})$. This won’t be irreducible however, as again, the vectors that have all coordinates equal in $\mathbb{C}^4$ will be invariant under the action of the $\rho(\sigma)$’s. The $3$ dimensional complement of this invariant line will also be invariant, and that is our standard representation, whose character we call $\chi_{\text{std}}$. Doing the computations (the trace of a permutation matrix counts fixed points), we get $\chi_{\text{std}}((1\,2)) = 1$, $\chi_{\text{std}}((1\,2)(3\,4)) = -1$, $\chi_{\text{std}}((1\,2\,3)) = 0$ and $\chi_{\text{std}}((1\,2\,3\,4)) = -1$.
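Here is a short sketch of that computation, using one representative per conjugacy class (written, as an assumption of the sketch, in 0-indexed one-line notation) and the fixed-point count minus one.

```python
def chi_std(sigma):
    """(#fixed points of sigma) - 1, the character of the standard representation."""
    return sum(1 for j, i in enumerate(sigma) if i == j) - 1

reps = {
    "e":          (0, 1, 2, 3),
    "(1 2)":      (1, 0, 2, 3),
    "(1 2)(3 4)": (1, 0, 3, 2),
    "(1 2 3)":    (1, 2, 0, 3),
    "(1 2 3 4)":  (1, 2, 3, 0),
}
for name, sigma in reps.items():
    print(name, chi_std(sigma))   # 3, 1, -1, 0, -1
```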
It’s time to invoke orthogonality! We still have two unknown characters: $\chi_4$ and $\chi_5$. Using the sum of squares formula, we have $1^2 + 1^2 + 3^2 + \chi_4(e)^2 + \chi_5(e)^2 = 24$, which implies $\chi_4(e)^2 + \chi_5(e)^2 = 13$, which forces $\chi_4(e) = 2$ and $\chi_5(e) = 3$. Letting $\chi_4$ take on values $a_1$, $a_2$, $a_3$ and $a_4$ on the classes of $(1\,2)$, $(1\,2)(3\,4)$, $(1\,2\,3)$ and $(1\,2\,3\,4)$ (which have sizes $6$, $3$, $8$ and $6$) and using the three equations

$$\langle \chi_4, \chi_{\text{triv}} \rangle = \langle \chi_4, \chi_{\text{sgn}} \rangle = \langle \chi_4, \chi_{\text{std}} \rangle = 0,$$

we get:

$$2 + 6a_1 + 3a_2 + 8a_3 + 6a_4 = 0,$$
$$2 - 6a_1 + 3a_2 + 8a_3 - 6a_4 = 0,$$
$$6 + 6a_1 - 3a_2 - 6a_4 = 0.$$
Adding the first two equations, $4 + 6a_2 + 16a_3 = 0$, that is, $3a_2 + 8a_3 = -2$. Notice that the $a_i$’s must be integers, so this is a linear Diophantine equation. Upon solving, we get $a_2 = 2 + 8t$ and $a_3 = -1 - 3t$, for $t \in \mathbb{Z}$. Adding the last two equations, we get $8 + 8a_3 - 12a_4 = 0$ and so $a_4 = \frac{2a_3 + 2}{3} = -2t$, and substituting these expressions into the first equation yields $a_1 = -a_4 = 2t$.
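For comparison, here is a sketch that feeds the same three equations to sympy and solves for $a_1$, $a_2$, $a_4$ in terms of $a_3$; it returns the same one-parameter family.

```python
from sympy import symbols, solve

a1, a2, a3, a4 = symbols("a1 a2 a3 a4")
eqs = [
    2 + 6*a1 + 3*a2 + 8*a3 + 6*a4,    # <chi_4, chi_triv> = 0, cleared of the 1/24
    2 - 6*a1 + 3*a2 + 8*a3 - 6*a4,    # <chi_4, chi_sgn>  = 0
    6 + 6*a1 - 3*a2 - 6*a4,           # <chi_4, chi_std>  = 0
]
sol = solve(eqs, [a1, a2, a4], dict=True)[0]   # solve for a1, a2, a4 in terms of a3
print(sol)
# a1 and a4 come out as -(2*a3 + 2)/3 and (2*a3 + 2)/3, and a2 as -(8*a3 + 2)/3;
# substituting a3 = -1 - 3*t recovers the parametrization in the text.
```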
We do the same drill with $\chi_5$ (which takes on the values $b_1$, $b_2$, $b_3$ and $b_4$ on the same four classes) to get that $b_1 = 2s - 1$, $b_2 = -1 + 8s$, $b_3 = -3s$ and $b_4 = 1 - 2s$ for $s \in \mathbb{Z}$.
Lastly, we have an equation involving both the $a_i$’s and the $b_i$’s as $\langle \chi_4, \chi_5 \rangle = 0$, which gives us $3s - 2t + 13st = 0$. This implies $s = 0$ and $t = 0$ by SFFT (multiply through by $13$ and factor: $(13s - 2)(13t + 3) = -6$, which has no integer solutions other than $s = t = 0$).
Plugging $s = t = 0$ back in gives $\chi_4 = (2, 0, 2, -1, 0)$ and $\chi_5 = (3, -1, -1, 0, 1)$ across the classes of $e$, $(1\,2)$, $(1\,2)(3\,4)$, $(1\,2\,3)$ and $(1\,2\,3\,4)$. This completes our character table for $S_4$ – just using orthogonality.

| $S_4$ | $e$ | $(1\,2)$ | $(1\,2)(3\,4)$ | $(1\,2\,3)$ | $(1\,2\,3\,4)$ |
| --- | --- | --- | --- | --- | --- |
| $\chi_{\text{triv}}$ | $1$ | $1$ | $1$ | $1$ | $1$ |
| $\chi_{\text{sgn}}$ | $1$ | $-1$ | $1$ | $1$ | $-1$ |
| $\chi_{\text{std}}$ | $3$ | $1$ | $-1$ | $0$ | $-1$ |
| $\chi_4$ | $2$ | $0$ | $2$ | $-1$ | $0$ |
| $\chi_5$ | $3$ | $-1$ | $-1$ | $0$ | $1$ |
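Here is a quick sketch that double-checks the completed table: all five rows are orthonormal with respect to the inner product weighted by class sizes.

```python
from fractions import Fraction

class_sizes = [1, 6, 3, 8, 6]                 # the sizes sum to |S_4| = 24
table = [
    [1,  1,  1,  1,  1],   # trivial
    [1, -1,  1,  1, -1],   # sign
    [3,  1, -1,  0, -1],   # standard
    [2,  0,  2, -1,  0],   # chi_4
    [3, -1, -1,  0,  1],   # chi_5
]

def inner(chi, psi):
    return Fraction(sum(n * a * b for n, a, b in zip(class_sizes, chi, psi)), 24)

for i, chi in enumerate(table):
    for j, psi in enumerate(table):
        assert inner(chi, psi) == (1 if i == j else 0)
print("rows are orthonormal")
```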
But dealing with that many variables all at once was not pleasant, to say the least. The trouble there was that we were taking the inner product of two rows with each other, and it so happened that one of the rows consisted entirely of unknowns, giving us simultaneous equations in several unknowns each, which is bound to be computationally heavy.
Instead, notice what happens when we take the inner product of two columns, viewed as ordinary vectors in Euclidean space. For instance, take the $1$st and $3$rd columns from the table above. We get

$$1 \cdot 1 + 1 \cdot 1 + 3 \cdot (-1) + 2 \cdot 2 + 3 \cdot (-1) = 0.$$

The two columns are orthogonal!
As you might’ve guessed, in general, any two columns are orthogonal! Mathematically, we would write this as

$$\sum_{i=1}^{k} \chi_i(g)\,\overline{\chi_i(h)} = 0,$$

given that $g$ and $h$ come from different columns – that is, $g$ and $h$ are not conjugate to each other. This expression can be proved by starting with the function $f_g$ for a $g \in G$, defined such that $f_g(x) = 1$ if $x$ is conjugate to $g$ and $f_g(x)$ is zero otherwise. Then, the result follows as soon as you write this function as a linear combination of the irreducible characters, which you can do as $f_g$ is clearly a class function.
In general, this is extremely useful if we have determined all characters except for one: we could just take the inner product of each column with the column corresponding to the identity element, which we would know, allowing us to deal with one variable at a time. As a bonus, we don’t have to know the size of each conjugacy class any more, which we required when dealing with row orthogonality.
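Here is a sketch of column orthogonality on the $S_4$ table above (the character values here are real, so no complex conjugation is needed): distinct columns have dot product $0$, and a column dotted with itself gives $|G|$ divided by the size of that class.

```python
class_sizes = [1, 6, 3, 8, 6]
table = [
    [1,  1,  1,  1,  1],
    [1, -1,  1,  1, -1],
    [3,  1, -1,  0, -1],
    [2,  0,  2, -1,  0],
    [3, -1, -1,  0,  1],
]
columns = list(zip(*table))
for i, col_i in enumerate(columns):
    for j, col_j in enumerate(columns):
        dot = sum(a * b for a, b in zip(col_i, col_j))
        # |G| / |class| on the diagonal, 0 off the diagonal
        assert dot == (24 // class_sizes[i] if i == j else 0)
print("columns are orthogonal")
```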
Tensor Products
Recall that we have the notion of the tensor product of two representations – a tool that we can use to possibly build $\chi_4$ and $\chi_5$ from $\chi_{\text{triv}}$, $\chi_{\text{sgn}}$ and $\chi_{\text{std}}$. In that direction, we will derive an expression for the character of a tensor product. But before that, we look at a slightly different expression for a character of a representation.
Let’s fix a basis $\{e_1, \dots, e_n\}$ for $V$. Recall that we have a corresponding dual basis, $\{e_1^*, \dots, e_n^*\}$, for $V^*$, the dual space of $V$: given a $v \in V$, $e_i^*(v)$ is the coefficient of $e_i$ in the expansion of $v$ in terms of the $e_i$ basis. Thus, the matrix representation (with respect to $\{e_1, \dots, e_n\}$) for $\rho(g)$ has $(i, j)$-entry $e_i^*(\rho(g)\, e_j)$. Taking the sum of the diagonal entries to get the trace, we have

$$\chi_V(g) = \operatorname{tr}(\rho(g)) = \sum_{i=1}^{n} e_i^*(\rho(g)\, e_i).$$

It turns out that this expression for the trace of a linear operator, and thus the character, is quite versatile, primarily because it does not rely on concrete matrices all that much.
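A tiny sketch of this formula for an arbitrary operator, where $e_i^*$ is implemented as "read off the $i$-th coordinate":

```python
import numpy as np

T = np.array([[1, 2, 0],
              [0, 3, 4],
              [5, 0, 6]])
n = T.shape[0]
e = np.eye(n, dtype=int)                     # the standard basis vectors e_1, ..., e_n
dual = lambda i: (lambda v: v[i])            # e_i^* reads off the i-th coordinate
assert sum(dual(i)(T @ e[i]) for i in range(n)) == np.trace(T)
```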
Now we can work with the character of a tensor product better: let $W$ be a vector space with basis $\{f_1, \dots, f_m\}$. Then a basis for $V \otimes W$ is $\{e_i \otimes f_j\}_{i,j}$. A corresponding dual basis for $(V \otimes W)^*$ would be $\{(e_i \otimes f_j)^*\}_{i,j}$, where we define $(e_i \otimes f_j)^*(e_k \otimes f_l) = e_i^*(e_k)\, f_j^*(f_l)$ and extend linearly. That is, $(e_i \otimes f_j)^*$ extracts the coefficient of $e_i \otimes f_j$ in the expansion of the input in the basis $\{e_i \otimes f_j\}_{i,j}$. That gives us that the value of $(e_i \otimes f_j)^*$ for an elementary tensor $v \otimes w$ is $e_i^*(v)\, f_j^*(w)$, which is also just the product of the two individual coefficients.
All in all, armed with this new formula for the trace, we have

$$\chi_{V \otimes W}(g) = \sum_{i, j} (e_i \otimes f_j)^*\big(g \cdot (e_i \otimes f_j)\big) = \sum_{i, j} e_i^*(g \cdot e_i)\, f_j^*(g \cdot f_j) = \left(\sum_i e_i^*(g \cdot e_i)\right)\left(\sum_j f_j^*(g \cdot f_j)\right) = \chi_V(g)\, \chi_W(g).$$

So the tensor product of representations just has the effect of multiplying the corresponding characters. In fact, going back to our character table for $S_4$, we can see that $\chi_5 = \chi_{\text{std}} \cdot \chi_{\text{sgn}}$ – a much more hassle-free way of constructing new characters out of old ones, which we’ll explore and use heavily in upcoming posts!
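Two quick checks of this in code: the trace of a Kronecker product of matrices is the product of the traces, and multiplying the standard and sign rows of the $S_4$ table entrywise recovers $\chi_5$.

```python
import numpy as np

A = np.array([[0, -1], [1, -1]])      # any two square matrices will do
B = np.array([[2,  1], [0,  3]])
assert np.trace(np.kron(A, B)) == np.trace(A) * np.trace(B)

chi_std = [3,  1, -1, 0, -1]
chi_sgn = [1, -1,  1, 1, -1]
chi_5   = [3, -1, -1, 0,  1]
assert [a * b for a, b in zip(chi_std, chi_sgn)] == chi_5
```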
Before You Go…
Here’s another fact to illustrate the intimate connection between the irreducible characters of a group and its conjugacy classes. Here, $C_1, C_2, \dots, C_k$ denote the conjugacy classes of $G$, and $C(x)$ for an $x \in G$ denotes the conjugacy class of $x$.
Consider $\varphi$, an automorphism of $G$. There is a natural way by which we can view $\varphi$ as acting on the set of conjugacy classes $\{C_1, \dots, C_k\}$: we define $\varphi \cdot C = C(\varphi(x))$, where $x$ is an arbitrary representative of $C$.
Next, we let $\varphi$ act on the set $\{\chi_1, \dots, \chi_k\}$ of irreducible characters of $G$ by setting $(\varphi \cdot \chi)(x) = \chi(\varphi^{-1}(x))$. Put together, this means that we’ve defined two actions on the sets $\{C_1, \dots, C_k\}$ and $\{\chi_1, \dots, \chi_k\}$.
Given this setup, the proposition is that

$$\#\{j : \varphi \cdot C_j = C_j\} = \#\{i : \varphi \cdot \chi_i = \chi_i\}.$$

In other words, the number of fixed points corresponding to either of the two group actions is the same.
To prove this, we’ll translate this into inner products. First, note that the operation is actually an action on the irreducible characters, in that $\varphi \cdot \chi$ is again an irreducible character for every irreducible character $\chi$. This follows for the same reason the averaging trick used in a proof of Maschke’s theorem works:

$$\langle \varphi \cdot \chi, \varphi \cdot \chi \rangle = \frac{1}{|G|} \sum_{g \in G} \chi(\varphi^{-1}(g))\, \overline{\chi(\varphi^{-1}(g))}.$$

At this point, note that $\varphi^{-1}$ is a bijection from $G$ into itself, so $\varphi^{-1}(g)$ will still run through all of $G$ as $g$ varies over $G$. Hence, we have $\langle \varphi \cdot \chi, \varphi \cdot \chi \rangle = \langle \chi, \chi \rangle = 1$, where the last equality follows as $\chi$ is irreducible.
Now we’re free to use the orthogonality relations! We get $\langle \varphi \cdot \chi_i, \chi_i \rangle = 1$ if $\varphi$ fixes $\chi_i$ and is $0$ otherwise. Adding all these inner products up, we get:

$$\#\{i : \varphi \cdot \chi_i = \chi_i\} = \sum_{i=1}^{k} \langle \varphi \cdot \chi_i, \chi_i \rangle = \sum_{i=1}^{k} \frac{1}{|G|} \sum_{g \in G} \chi_i(\varphi^{-1}(g))\, \overline{\chi_i(g)},$$

where we have expanded each inner product as a sum over $G$. Switching the order of the summations (because they’re always in the wrong order), we get

$$\frac{1}{|G|} \sum_{g \in G} \sum_{i=1}^{k} \chi_i(\varphi^{-1}(g))\, \overline{\chi_i(g)}.$$
Aha, look at the inner sum now – we can invoke the column orthogonality that saved us earlier! Doing that and collecting terms, we get

$$\frac{1}{|G|} \sum_{g \in G} \frac{|G|}{|C(g)|}\, \delta(g),$$

where $\delta(g)$ is $1$ if $\varphi^{-1}(g)$ is conjugate to $g$ – that is, if $\varphi$ fixes the class $C(g)$ – and $0$ otherwise. Recognising that the summand is a class function, we can group the sum over $g$ by conjugacy classes to get

$$\frac{1}{|G|} \sum_{j=1}^{k} |C_j| \cdot \frac{|G|}{|C_j|}\, \delta(C_j) = \sum_{j=1}^{k} \delta(C_j),$$

and so the whole expression equals $\#\{j : \varphi \cdot C_j = C_j\}$, which is the cardinality of the first set, the fixed points under the action on the conjugacy classes.
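To see the statement in action on a small example of my own choosing, here is a numerical sketch for the cyclic group $\mathbb{Z}/7$, where every conjugacy class is a single element, the irreducible characters are $\chi_k(x) = e^{2\pi i k x / 7}$, and the automorphism is $x \mapsto 2x$.

```python
import cmath

n = 7
phi = lambda x: (2 * x) % n              # the automorphism x -> 2x
phi_inv = lambda x: (4 * x) % n          # its inverse, since 2 * 4 = 8 = 1 (mod 7)

def chi(k):                              # the k-th irreducible character of Z/7
    return lambda x: cmath.exp(2j * cmath.pi * k * x / n)

def inner(f, g):                         # <f, g> = (1/|G|) sum_x f(x) conj(g(x))
    return sum(f(x) * g(x).conjugate() for x in range(n)) / n

# sum_k <phi . chi_k, chi_k> counts the characters fixed by phi ...
fixed_chars = sum(inner(lambda x, k=k: chi(k)(phi_inv(x)), chi(k)) for k in range(n)).real
# ... and it matches the number of conjugacy classes (here: elements) fixed by phi.
fixed_classes = sum(1 for x in range(n) if phi(x) == x)
print(round(fixed_chars), fixed_classes)   # 1 1
```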