# Least Action

Nontrivializing triviality... and vice versa.

## Group Theory I – O(n), SO(n), U(n), SU(n)


I have been trying to learn group theory for a long time. Invariably, the books I come across are either too formal or too cursory. That said, there are of course a number of very well-written introductions to group theory available on the internet. In this post, I will try not to bore you with what a group is (chances are you already know, if you’re reading this), but will present a somewhat different perspective on how it fits into our ‘daily’ physics.

### Orthogonal and Special Orthogonal Groups

The set of all orthogonal matrices, i.e. matrices satisfying

$A^{T}A = AA^{T} = I$

forms a group denoted by $O(n)$. Now, if $A$ is a real orthogonal matrix of dimension $n$, it has $n(n-1)/2$ independent parameters. This is easily seen by taking $n = 2$. Let me try a more general proof here.

$A = \left(\begin{array}{cccc}a_{11}&a_{12}&a_{13}&\ldots\\a_{21}&a_{22}&a_{23}&\ldots\\a_{31}&a_{32}&a_{33}&\ldots\\ \ldots & \ldots & \ldots &\ldots\end{array}\right)$

Clearly,

$(A^{T}A)_{ij} = \sum_{k}a_{ki}a_{kj} = \sum_{k}a_{ik}a_{jk} = \delta_{ij}$

There are $n$ of these equations with $1$ on the RHS, and there are $(n^2-n)$ such equations with a $0$ on the RHS. But of the latter, only half are distinct, because $i \leftrightarrow j$ yields the same equation. So the number of independent conditions is $n + n(n-1)/2 = n(n+1)/2$, and imposing them on the $n^2$ entries of $A$ leaves $n^2 - n(n+1)/2 = n(n-1)/2$ independent parameters.

Why is $O(n)$ important? It is the rotation group in $n$-dimensional Euclidean space. If we further restrict the matrices to have a determinant of $+1$, they are called special, and form the group $SO(n)$. It is well known that rotation matrices in 3 dimensions are orthogonal and have unit determinant. Note that orthogonality only guarantees a determinant of $\pm 1$; proper rotations are precisely the elements of $SO(n)$.
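As a quick numerical sanity check (a sketch using NumPy; the helper name `rotation_3d` is mine, not standard), we can build a 3D rotation from three Euler angles — $n(n-1)/2 = 3$ parameters for $n = 3$ — and verify that it is orthogonal with determinant $+1$:

```python
import numpy as np

def rotation_3d(alpha, beta, gamma):
    """Compose a 3D rotation from three Euler angles (z-x-z convention)."""
    def rz(t):
        c, s = np.cos(t), np.sin(t)
        return np.array([[c, -s, 0.], [s, c, 0.], [0., 0., 1.]])
    def rx(t):
        c, s = np.cos(t), np.sin(t)
        return np.array([[1., 0., 0.], [0., c, -s], [0., s, c]])
    return rz(alpha) @ rx(beta) @ rz(gamma)

A = rotation_3d(0.3, 1.1, -0.7)
# Orthogonality: A^T A = I
assert np.allclose(A.T @ A, np.eye(3))
# Proper rotation: det A = +1, so A lies in SO(3)
assert np.isclose(np.linalg.det(A), 1.0)
```

The three angles here play exactly the role of the continuous parameters counted above.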

A remark on parametrization: elements of both $O(n)$ and $SO(n)$ are parametrized in general by continuous variables. For a rotation in $2$-dimensional space, one angle is sufficient; in $3$ dimensions, we have the three Euler angles. In general, each rotation matrix is a function of $n(n-1)/2$ continuous angles. Groups that depend on continuously varying parameters are called Lie groups. Also, since the angles vary over closed, finite intervals, these groups are said to be compact.

### Unitary and Special Unitary Groups

The set of unitary matrices, i.e. matrices satisfying

$A^{\dagger}A = AA^{\dagger} = I$

forms a group denoted by $U(n)$. Let’s use the above idea to find the number of independent parameters of a unitary matrix of dimension $n$.

$A = \left(\begin{array}{cccc}a_{11}&a_{12}&a_{13}&\ldots\\a_{21}&a_{22}&a_{23}&\ldots\\a_{31}&a_{32}&a_{33}&\ldots\\ \ldots & \ldots & \ldots &\ldots\end{array}\right)$

The unitarity condition translates to

$(A^{\dagger}A)_{ij} = \sum_{k}a^{*}_{ki}a_{kj} = \sum_{k}a_{ik}a^{*}_{jk} = \delta_{ij}$

A complex $n \times n$ matrix has $2n^2$ real parameters. In this case, the $i = j$ (diagonal) conditions are real, since $\sum_{k}|a_{ki}|^2 = 1$; that is $n$ real conditions. The off-diagonal conditions come in complex-conjugate pairs under $i \leftrightarrow j$, so only the $n(n-1)/2$ with $i < j$ are independent, and each is a complex equation worth two real conditions. That makes $n + n(n-1) = n^2$ real conditions in all, leaving $n^2$ independent real parameters for a matrix in $U(n)$.

Additionally, if the determinant of the matrix is $+1$, the matrix is said to be special, and belongs to the special unitary group of order $n$, i.e. $SU(n)$. Since the determinant of a unitary matrix is already a phase ($|\det A| = 1$), fixing it to $+1$ is one extra real condition, so $SU(n)$ has $n^2 - 1$ independent parameters. By the way, I haven’t shown that any of these groups are indeed groups. For that, you have to show closure under group multiplication, associativity, and the existence of a unit element and of an inverse for every element. Why is $SU(n)$ important? Well, that requires motivation through $SU(2)$, the ‘spin group’ of nonrelativistic quantum mechanics. I will address this in a subsequent post.
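The counting above can be checked numerically. The sketch below (the helper `random_unitary` is my own name) builds a random unitary via the QR decomposition of a complex Gaussian matrix, verifies unitarity and closure under multiplication, and rescales by an $n$-th root of the determinant to land in $SU(n)$:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_unitary(n):
    """A random n x n unitary, from the QR decomposition of a complex Gaussian matrix."""
    z = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    q, _ = np.linalg.qr(z)
    return q

n = 3
U, V = random_unitary(n), random_unitary(n)
assert np.allclose(U.conj().T @ U, np.eye(n))              # unitarity
assert np.allclose((U @ V).conj().T @ (U @ V), np.eye(n))  # closure under multiplication
assert np.isclose(abs(np.linalg.det(U)), 1.0)              # det of a unitary is a phase

# Dividing by an n-th root of the determinant gives an element of SU(n)
S = U / np.linalg.det(U) ** (1.0 / n)
assert np.isclose(np.linalg.det(S), 1.0)
```

Note how fixing the determinant removes exactly the one leftover phase, matching the $n^2 \to n^2 - 1$ parameter count.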

Much of group theory involves a lot of jargon, which can be a bit tricky to connect to the “real” world, if you think like me. But it helps to know the jargon, as in any field you want to grapple with.

A subset $G'$ of a group $G$ that is itself a group under the multiplication of $G$ (i.e. closed under multiplication and inverses) is called a subgroup of $G$.

Let $g' \in G'$ and $g \in G$. If $gg'g^{-1} \in G'$ for every $g \in G$ and $g' \in G'$, then $G'$ is called an invariant subgroup of G.
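A concrete instance of this definition, sketched numerically: $SO(2)$ is an invariant subgroup of $O(2)$. Conjugating a rotation by a reflection (an element of $O(2)$ with determinant $-1$) yields another rotation:

```python
import numpy as np

def rot(t):
    """Element of SO(2): rotation by angle t."""
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s], [s, c]])

# A reflection: in O(2) but not SO(2), since det = -1
g = np.array([[1.0, 0.0], [0.0, -1.0]])
gp = rot(0.8)                     # g' in the subgroup SO(2)

conj = g @ gp @ np.linalg.inv(g)  # g g' g^{-1}
# The conjugate is still orthogonal with det = +1, i.e. still in SO(2)
assert np.allclose(conj.T @ conj, np.eye(2))
assert np.isclose(np.linalg.det(conj), 1.0)
```

(In fact $g \, R(\theta) \, g^{-1} = R(-\theta)$ for this reflection: conjugation reverses the sense of rotation but stays inside $SO(2)$.)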

### Representations

So far, we have looked at matrix representations of the standard orthogonal and unitary groups. These are useful because they let us employ familiar rules of matrix algebra to study the properties of the group in question. Matrix representations are closely associated with symmetries. A good example is the time-independent Schrödinger equation,

$H\psi = E\psi$

which is an eigenvalue problem for a stationary state $\psi$. If there exists a group $G$ of transformations under which the Hamiltonian $H$ stays invariant, then

$H_{trans} = RHR^{-1} = H$

for every $R \in G$; this is the group action. In particular, every element of $G$ commutes with the Hamiltonian, $RH = HR$, and hence, by a standard result from linear algebra, $H$ and $R$ can be simultaneously diagonalized for every $R \in G$ (quite a strong result!).

Now,

$R(H\psi) = R(E\psi) = E(R\psi)$

but

$R(H\psi) = H(R\psi) = E(R\psi)$

So, all the transformed states are degenerate and constitute a multiplet.
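The degeneracy argument above can be seen in a toy example (a hypothetical 3-site hopping Hamiltonian, checked with NumPy): the cyclic shift of the sites commutes with $H$, and the spectrum indeed contains a degenerate multiplet.

```python
import numpy as np

# Hypothetical toy model: hopping Hamiltonian on a 3-site ring
H = np.array([[0., 1., 1.],
              [1., 0., 1.],
              [1., 1., 0.]])

# Cyclic shift of the three sites: a symmetry of the ring
R = np.array([[0., 0., 1.],
              [1., 0., 0.],
              [0., 1., 0.]])

# The symmetry commutes with H: R H R^{-1} = H
assert np.allclose(R @ H, H @ R)
assert np.allclose(R @ H @ np.linalg.inv(R), H)

# Spectrum: a nondegenerate level at 2 and a degenerate doublet at -1
evals = np.sort(np.linalg.eigvalsh(H))
assert np.allclose(evals, [-1., -1., 2.])
```

The two states at energy $-1$ are mapped into each other by $R$: they form a multiplet in the sense described above.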

Suppose $\Omega$ denotes the vector space of all transformed solutions, and has a finite dimension $N$. Then, it has a basis, which we can denote by $\psi_1, \psi_2, \ldots, \psi_{N}$. Since $R\psi_j$ belongs to the multiplet, it can be expanded in terms of this basis, as

$R\psi_j = \sum_{k}r_{jk}\psi_{k}$

This means that there is a matrix $\{r_{jk}\}$ associated with every group element $R$.

Group representations are of two kinds:

1. Irreducible, meaning that acting on any element of $\Omega$ with all the elements of $G$ generates the whole of $\Omega$, i.e. no proper subspace of $\Omega$ is left invariant, or

2. Reducible, in which case, the vector space $\Omega$ splits into a direct sum of vector subspaces each of which is mapped into itself (but not into another) under the action of $G$, i.e. $\Omega = \Omega_1 \oplus \Omega_2 \oplus \ldots$.
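The cyclic-shift matrices from the Hamiltonian example furnish a reducible representation of the cyclic group $\mathbb{Z}_3$, and we can exhibit the decomposition explicitly (a NumPy sketch): in the Fourier basis the shift becomes diagonal, i.e. a direct sum of three one-dimensional irreducible representations.

```python
import numpy as np

# The 3x3 cyclic-shift matrix generates a (reducible) representation of Z_3
R = np.array([[0., 0., 1.],
              [1., 0., 0.],
              [0., 1., 0.]])

# Fourier basis: columns psi_k with entries omega^{jk} / sqrt(3)
w = np.exp(2j * np.pi / 3)
F = np.array([[w ** (j * k) for k in range(3)] for j in range(3)]) / np.sqrt(3)
assert np.allclose(F.conj().T @ F, np.eye(3))  # the basis change is unitary

# In this basis the representation splits: D is diagonal,
# so Omega = Omega_1 (+) Omega_2 (+) Omega_3, each one-dimensional
D = F.conj().T @ R @ F
assert np.allclose(D, np.diag(np.diagonal(D)))
assert np.allclose(np.diagonal(D), [1, w**2, w])  # the three characters of Z_3
```

Each diagonal entry is a cube root of unity: the character of one irreducible representation of $\mathbb{Z}_3$. (For an abelian group, all irreducible representations are one-dimensional, which is why the blocks here are $1 \times 1$.)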