There is a deep relationship between the solvability of a differential equation and its symmetries. Much of the theory of second order linear differential equations is really the theory of infinite dimensional linear algebra. In particular, Sturm-Liouville theory is the diagonalization of an infinite dimensional Hermitian operator. However there are deeper relationships, as Miller points out in “Lie theory and special functions”; relations between special functions, such as Rodrigues’ formulae, are tied to the Lie algebra and symmetries of the system. Even better, in some cases the solutions can be found almost entirely algebraically. Some examples from physics are the Simple Harmonic Oscillator, the theory of Angular Momentum, and the Kepler Problem (using the Laplace-Runge-Lenz vector). The rest of this article will be devoted to exploring a special case of these relations: the Quantum Simple Harmonic Oscillator.
We begin with trying to solve the differential equation $-\frac{1}{2m}\frac{d^2 f}{dx^2} + \frac{k}{2}x^2 f = \lambda f$ for some real positive constants $k$, $m$ and $\lambda$, with the boundary condition that $f$ vanishes at infinity. This is an eigenvalue equation; it can’t be solved for arbitrary constants but only for particular values of $\lambda$ for a fixed $k$ and $m$. By dilations (that is, rescaling units) we can assume without loss of generality $k = 1$ and $m = 1$. It is useful to define the momentum operator $p = -i\frac{d}{dx}$ – this makes everything more physics-like. If this isn’t familiar to you, just substitute $-i\frac{d}{dx}$ wherever you see a $p$.
(The $i$ is chosen to make the operator Hermitian with respect to the inner product $\langle f, g \rangle = \int_{-\infty}^{\infty} f(x)^* g(x)\,dx$; that is, $\langle f, pg \rangle = \langle pf, g \rangle$, where $*$ denotes complex conjugation and $f$ and $g$ are zero at infinity. This identity follows immediately from integration by parts.)
Introducing the Hamiltonian operator $H = \frac{1}{2}(p^2 + x^2)$, the differential equation is then the eigenvalue equation $Hf = \lambda f$ (a form familiar to physicists). The Hamiltonian operator has an obvious symmetry to it: it is invariant under rotations in $x$-$p$ space. That is, it is invariant under transformations of the form $x \mapsto x\cos\theta + p\sin\theta$, $p \mapsto -x\sin\theta + p\cos\theta$.
Following the ideas of Sophus Lie we look at the infinitesimal transformation generating this; taking the derivative at the identity gives $x \mapsto p$, $p \mapsto -x$, which in $x$-$p$ space is given by the matrix $\begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix}$.
The finite transformation this generates at $\theta = \pi/2$ is precisely the Fourier transform $(Ff)(p) = \frac{1}{\sqrt{2\pi}}\int_{-\infty}^{\infty} e^{ipx}f(x)\,dx$. In particular, by integration by parts and differentiating under the integral respectively, it follows that $F(pf) = -x(Ff)$ and $F(xf) = p(Ff)$, so as an operator on functions $FpF^{-1} = -x$ and $FxF^{-1} = p$.
Now, since the square of the Fourier transform acts on $x$-$p$ space as minus the identity (rotating by $\pi/2$ twice is rotation by $\pi$), on $x$-$p$ space it has eigenvalues $-i$ and $+i$ and corresponding eigenvectors $a = \frac{1}{\sqrt{2}}(x + ip)$ and $a^\dagger = \frac{1}{\sqrt{2}}(x - ip)$.
Now we introduce the commutator of operators, $[A, B] = AB - BA$, and in particular $[x, p] = i$ (since multiplication by $x$ and differentiation don’t commute). Consequently, by linearity, $[a, a^\dagger] = 1$.
Simple calculations show that $H = a^\dagger a + \frac{1}{2}$, $[H, a] = -a$, and $[H, a^\dagger] = a^\dagger$. These last two relations allow us to find the spectrum of $H$, that is, the values of $\lambda$ for which the differential equation is solvable!
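These simple calculations can be spot-checked mechanically. Here is a short sketch using sympy (assumed available), verifying $[x,p] = i$, $[a, a^\dagger] = 1$ and $H = a^\dagger a + \frac{1}{2}$ by acting on a generic smooth function:

```python
import sympy as sp

t = sp.symbols('x', real=True)
f = sp.Function('f')(t)

def p(g):  return -sp.I * sp.diff(g, t)             # momentum operator p = -i d/dx
def X(g):  return t * g                             # position operator: multiply by x
def a(g):  return (X(g) + sp.I*p(g)) / sp.sqrt(2)   # lowering operator
def ad(g): return (X(g) - sp.I*p(g)) / sp.sqrt(2)   # raising operator
def H(g):  return (p(p(g)) + X(X(g))) / 2           # Hamiltonian (p^2 + x^2)/2

# [x, p] f = i f
assert sp.simplify(sp.expand(X(p(f)) - p(X(f)) - sp.I*f)) == 0
# [a, a†] f = f
assert sp.simplify(sp.expand(a(ad(f)) - ad(a(f)) - f)) == 0
# H f = (a†a + 1/2) f
assert sp.simplify(sp.expand(H(f) - ad(a(f)) - f/2)) == 0
```

The commutators $[H, a]$ and $[H, a^\dagger]$ can be checked the same way by composing the operator functions.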
If the differential equation can be solved for some $\lambda$, say $Hf = \lambda f$, then using the commutation relations shows $H(af) = (\lambda - 1)\,af$ and $H(a^\dagger f) = (\lambda + 1)\,a^\dagger f$. Thus $a$ lowers the eigenvalue by 1 and is called a lowering operator, and $a^\dagger$ raises the eigenvalue by 1 and is called a raising operator.
However we cannot lower indefinitely: $H$ is positive definite, since $\langle f, Hf \rangle = \langle f, a^\dagger a f \rangle + \frac{1}{2}\langle f, f \rangle = \langle af, af \rangle + \frac{1}{2}\langle f, f \rangle \geq \frac{1}{2}\langle f, f \rangle$ (where the dagger indicates Hermitian conjugation with respect to the inner product), so $\lambda$ must be at least $\frac{1}{2}$. Thus there is a function $f_0$ for which $af_0 = 0$ (which of course satisfies the differential equation trivially). On this state $Hf_0 = (a^\dagger a + \frac{1}{2})f_0 = \frac{1}{2}f_0$.
Moreover, any solution can be brought to $f_0$ by repeated lowerings (applications of $a$): the lowering must terminate because the eigenvalues are bounded below, and it can only terminate at a function annihilated by $a$. Since lowering then raising gives back a multiple of the original function ($a^\dagger a f = (\lambda - \frac{1}{2})f$), every solution can be obtained by raising $f_0$. Thus the only possible eigenvalues are $\lambda = n + \frac{1}{2}$ for $n = 0, 1, 2, \dots$.
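This spectrum can also be seen numerically: discretizing $H = \frac{1}{2}(p^2 + x^2)$ with finite differences and diagonalizing the resulting matrix reproduces eigenvalues close to $n + \frac{1}{2}$. A sketch (the grid size and range are arbitrary choices):

```python
import numpy as np

# Discretize H = (p^2 + x^2)/2 on [-L, L] with central second differences.
N, L = 1000, 10.0
x = np.linspace(-L, L, N)
h = x[1] - x[0]
# standard tridiagonal approximation to f''
D2 = (np.diag(-2.0*np.ones(N)) + np.diag(np.ones(N-1), 1)
      + np.diag(np.ones(N-1), -1)) / h**2
H = -0.5*D2 + 0.5*np.diag(x**2)
lowest = np.linalg.eigvalsh(H)[:5]
print(np.round(lowest, 3))   # approximately [0.5, 1.5, 2.5, 3.5, 4.5]
```

The agreement improves as the grid is refined, since the discretization error is $O(h^2)$.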
What are the corresponding eigenvectors? Well, $af_0 = 0$ implies that $f_0' + xf_0 = 0$, which has solutions $f_0(x) = Ce^{-x^2/2}$ for some constant $C$. Then the solution with $\lambda = n + \frac{1}{2}$ is, up to a constant factor (taking $C = 1$), $(a^\dagger)^n f_0 = 2^{-n/2}\left(x - \frac{d}{dx}\right)^n e^{-x^2/2} = 2^{-n/2}H_n(x)e^{-x^2/2}$, where $H_n$ are the Hermite polynomials. Consequently we have found all solutions of the second order differential equation just by solving a first order differential equation! (They can also easily be normalized algebraically, that is, without doing any integrals, but I won’t show that here.)
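The repeated-raising formula can be checked against sympy’s built-in Hermite polynomials (which use the physicists’ convention). A sketch, applying the unnormalized raising operator $\sqrt{2}\,a^\dagger = x - \frac{d}{dx}$ a few times:

```python
import sympy as sp

x = sp.symbols('x', real=True)
f0 = sp.exp(-x**2/2)      # ground state: a f0 = 0

g = f0
for n in range(1, 6):
    # apply (x - d/dx), building (x - d/dx)^n e^{-x^2/2}
    g = sp.expand(x*g - sp.diff(g, x))
    # should equal H_n(x) e^{-x^2/2}
    assert sp.expand(g - sp.hermite(n, x)*f0) == 0
```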
It is interesting to note that all these solutions are invariant under the Fourier transform, up to a scalar factor. This is of course a consequence of the Hamiltonian being invariant under the Fourier transform $F$: if $Hf = \lambda f$ then $H(Ff) = F(Hf) = \lambda(Ff)$, and thus, since each eigenspace is one-dimensional, $Ff$ is a multiple of $f$.
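For the ground state this can be checked directly; with the convention $(Ff)(p) = \frac{1}{\sqrt{2\pi}}\int e^{ipx}f(x)\,dx$ used above, sympy confirms that the Gaussian is its own Fourier transform (a sketch):

```python
import sympy as sp

x, k = sp.symbols('x k', real=True)
f0 = sp.exp(-x**2/2)

# (F f0)(k) = (1/sqrt(2*pi)) * integral of e^{ikx} f0(x) over the real line
Ff0 = sp.integrate(f0*sp.exp(sp.I*k*x), (x, -sp.oo, sp.oo)) / sp.sqrt(2*sp.pi)
assert sp.simplify(Ff0 - sp.exp(-k**2/2)) == 0
```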
From an abstract point of view, what have we done? We have taken an algebra of operators on some Hilbert space generated by self-adjoint operators $x$ and $p$ satisfying $[x, p] = i$ (notice that this implies the vector space can’t be finite dimensional: take the trace of each side). Using this we have shown that the positive definite Hermitian operator $H = \frac{1}{2}(x^2 + p^2)$ has eigenvalues $n + \frac{1}{2}$ for $n = 0, 1, 2, \dots$.
We could choose an explicit representation: the Hilbert space is the space of square integrable functions, $x$ is the multiplication operator and $p = -i\frac{d}{dx}$; in this basis the eigenequation is the differential equation we started with. The solutions in this basis are the Hermite polynomials multiplied by a Gaussian; notice that these functions are orthogonal and complete in $L^2$, being all the eigenfunctions of a Hermitian operator. The formula for the eigenfunctions in terms of raising operators gives rise to a Rodrigues formula for the Hermite polynomials.
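The orthogonality claim is easy to spot-check symbolically; for example, with sympy (a sketch checking two pairs of eigenfunctions):

```python
import sympy as sp

x = sp.symbols('x', real=True)

def psi(n):
    # n-th eigenfunction (unnormalized): Hermite polynomial times Gaussian
    return sp.hermite(n, x) * sp.exp(-x**2/2)

# distinct eigenfunctions of the Hermitian operator H are orthogonal
assert sp.integrate(psi(1)*psi(3), (x, -sp.oo, sp.oo)) == 0
assert sp.integrate(psi(0)*psi(2), (x, -sp.oo, sp.oo)) == 0
```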
However, there is nothing canonical about this choice of representation; a different representation is given by the Fourier transform, which acts as a change of basis. That the Hamiltonian is invariant under the Fourier transform means $FHF^{-1} = H$, or $[F, H] = 0$.
The nicest choice of basis is the one in which $H$ is the (countably infinite dimensional) diagonal matrix with entries $\frac{1}{2}, \frac{3}{2}, \frac{5}{2}, \dots$. In the orthonormal eigenbasis, $a$ is the matrix with entries $\sqrt{1}, \sqrt{2}, \sqrt{3}, \dots$ just above the diagonal and zeros everywhere else (so that $a\,e_n = \sqrt{n}\,e_{n-1}$), and $a^\dagger$ is its transpose. Representations for $x$ and $p$ can be obtained from $x = \frac{1}{\sqrt{2}}(a + a^\dagger)$ and $p = \frac{1}{i\sqrt{2}}(a - a^\dagger)$.
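This matrix picture is easy to realize numerically with a finite truncation (the size $N$ below is an arbitrary choice); note that truncation spoils the commutation relation only in the last row and column:

```python
import numpy as np

N = 8
a  = np.diag(np.sqrt(np.arange(1, N)), k=1)   # lowering: sqrt(n) on the superdiagonal
ad = a.T                                      # raising operator (a is real, so .T = dagger)
H  = ad @ a + 0.5*np.eye(N)

print(np.diag(H))                             # [0.5 1.5 2.5 ... 7.5]

x = (a + ad) / np.sqrt(2)
p = (a - ad) / (1j*np.sqrt(2))
C = x @ p - p @ x                             # i*I, except at the truncation edge
print(np.allclose(C[:-1, :-1], 1j*np.eye(N-1)))   # True
```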
It is worth noting that in this derivation it wasn’t enough to have a Lie algebra, that is, a Lie bracket; we also needed a multiplication over which the Lie bracket is the commutator – that is, a representation.