
The Heisenberg picture of quantum mechanics


In an earlier post we introduced the Schrödinger picture of quantum mechanics, which can be summarized as follows: the state of a quantum system is described by a unit vector \psi in some Hilbert space L^2(X) (up to multiplication by a constant), and time evolution is given by

\displaystyle \psi \mapsto e^{ \frac{H}{i \hbar} t} \psi

where H is a self-adjoint operator on L^2(X) called the Hamiltonian. Observables are given by other self-adjoint operators F, and, at least when F has discrete spectrum, measurement can be described as follows: if \psi_k is a unit eigenvector of F with eigenvalue F_k, then F takes the value F_k upon measurement with probability \left| \langle \psi, \psi_k \rangle \right|^2; moreover, the state vector \psi is projected onto \psi_k.

The Heisenberg picture is an alternate way of understanding time evolution which de-emphasizes the role of the state vector. Instead of transforming the state vector, we transform observables, and this point of view allows us to talk about time evolution (independent of measurement) without mentioning state vectors at all: we can work entirely with the algebra of bounded operators. This point of view is attractive because, among other things, once we isolate what properties we need this algebra to have we can abstract them to a more general setting such as that of von Neumann algebras.

In order to get a feel for the kind of observables people actually care about, we won’t study a finite toy model in this post: instead we’ll work through some classical (!) one-dimensional examples.

The Heisenberg picture

From our above description of measurement of an observable F it follows that the expected value \mathbb{E}(F) can be given by the elegant formula

\displaystyle \mathbb{E}(F) = \langle \psi, F \psi \rangle = \langle F \psi, \psi \rangle.

Since the characteristic function t \mapsto \mathbb{E}(e^{itF}) of a random variable completely determines its distribution, it is possible in principle to replace knowledge of the state vector \psi with knowledge of the expectation \mathbb{E} it induces on observables. In the theory of von Neumann algebras, \mathbb{E} is referred to as a state for this reason.
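As a quick sanity check, here is the formula \mathbb{E}(F) = \langle \psi, F \psi \rangle in a two-dimensional toy example (the particular matrix F, the state \psi, and the use of numpy are arbitrary choices of mine): the expectation agrees with the sum of the eigenvalues of F weighted by the measurement probabilities \left| \langle \psi, \psi_k \rangle \right|^2.

    import numpy as np

    # Toy two-state system: a Hermitian observable F and a unit state vector psi.
    # Both are made-up data, just to illustrate the formula.
    F = np.array([[1.0, 0.5],
                  [0.5, -1.0]])          # self-adjoint, hence real eigenvalues
    psi = np.array([3.0, 4.0j]) / 5.0    # unit vector in C^2

    # E(F) = <psi, F psi>; for self-adjoint F this is real.
    expectation = np.vdot(psi, F @ psi).real

    # The same number from the spectral decomposition: eigenvalues of F weighted
    # by the measurement probabilities |<psi, psi_k>|^2.
    eigvals, eigvecs = np.linalg.eigh(F)
    probs = np.abs(eigvecs.conj().T @ psi) ** 2
    assert np.isclose(expectation, np.dot(probs, eigvals))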

Now, as \psi evolves according to the Schrödinger equation, the expectation of an observable F evolves as

\displaystyle \langle \psi, F \psi \rangle \mapsto \langle \psi, e^{- \frac{H}{i \hbar} t} F e^{ \frac{H}{i \hbar} t} \psi \rangle.

In the Schrödinger picture, we keep our algebra of observables A invariant and modify the state \psi (or equivalently the expectation \mathbb{E}), but in the Heisenberg picture, we keep the expectation \mathbb{E} invariant and modify the algebra of observables A by the one-parameter group of inner automorphisms

\displaystyle \phi_t :  F \mapsto e^{- \frac{H}{i \hbar} t} F e^{ \frac{H}{i \hbar} t} = F(t).

This is quite an elegant way to think about time evolution: it tells us that we can delay thinking about states until we actually want to compute probabilities. At any point before then, we can think directly about how observables are changing, and consequently we don’t need to mention states at all to talk about properties of observables which don’t depend on initial conditions.
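In finite dimensions the agreement of the two pictures is easy to check numerically. The sketch below (random Hermitian matrices, \hbar = 1, and scipy's expm, all my own choices) verifies that evolving the state and evolving the observable give the same expectation.

    import numpy as np
    from scipy.linalg import expm

    rng = np.random.default_rng(0)

    def random_hermitian(n):
        A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
        return (A + A.conj().T) / 2

    n, hbar, t = 4, 1.0, 0.7
    H = random_hermitian(n)                 # Hamiltonian
    F = random_hermitian(n)                 # some observable
    psi = rng.normal(size=n) + 1j * rng.normal(size=n)
    psi /= np.linalg.norm(psi)              # unit state vector

    U = expm(H * t / (1j * hbar))           # e^{Ht/(i hbar)}, unitary since H = H*

    # Schroedinger picture: evolve the state, keep the observable fixed.
    schrodinger = np.vdot(U @ psi, F @ (U @ psi))

    # Heisenberg picture: keep the state fixed, evolve the observable.
    F_t = U.conj().T @ F @ U                # e^{-Ht/(i hbar)} F e^{Ht/(i hbar)}
    heisenberg = np.vdot(psi, F_t @ psi)

    assert np.isclose(schrodinger, heisenberg)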

Now that we’ve conceived of time evolution as a one-parameter group of inner automorphisms \phi_t : A \to A of the algebra of observables, we can take its derivative, which is precisely the inner derivation

\displaystyle F \mapsto \left[ i \frac{H}{\hbar}, F \right] = \frac{i}{\hbar} \left( HF - FH \right)

by a simple calculation. Recalling that, for a one-parameter group of automorphisms \phi_t with associated derivation D, we have at least formally the Taylor expansion

\displaystyle \phi_t = \sum_{n \ge 0} \frac{D^n}{n!} t^n

it follows that

\displaystyle \phi_t(F) = F(t) = F + \left[ i \frac{H}{\hbar}, F \right] t + \frac{1}{2!} \left[ i \frac{H}{\hbar}, \left[ i \frac{H}{\hbar}, F \right] \right] t^2 + ....

In particular, time evolution is completely determined by the commutator of the Hamiltonian with any observable. Note that F is invariant under time evolution if and only if it commutes with the Hamiltonian; consequently, the infinitesimal generator of any one-parameter group of (unitary) symmetries of the Hamiltonian is, by Stone's theorem, a self-adjoint operator commuting with H, hence a conserved observable. This is how Noether's theorem appears in quantum mechanics.
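Here is a numerical sketch of the expansion above (again with arbitrary random Hermitian matrices and \hbar = 1): for small t the truncated commutator series matches exact conjugation by e^{ \frac{H}{i \hbar} t}, and an observable commuting with H (here H itself) is left invariant, illustrating the conservation statement.

    import numpy as np
    from math import factorial
    from scipy.linalg import expm

    rng = np.random.default_rng(1)

    def random_hermitian(n):
        A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
        return (A + A.conj().T) / 2

    n, hbar, t = 4, 1.0, 0.05               # small t, so a truncated series suffices
    H = random_hermitian(n)
    F = random_hermitian(n)

    def D(G):                               # the inner derivation G -> [iH/hbar, G]
        K = 1j * H / hbar
        return K @ G - G @ K

    # Exact Heisenberg evolution by conjugation ...
    U = expm(H * t / (1j * hbar))
    F_exact = U.conj().T @ F @ U

    # ... versus the truncated series  sum_k D^k(F) t^k / k!.
    F_series = np.zeros_like(F)
    term = F.copy()
    for k in range(12):
        F_series = F_series + term * t**k / factorial(k)
        term = D(term)
    assert np.allclose(F_exact, F_series)

    # An observable commuting with H (here H itself) is invariant: H(t) = H.
    assert np.allclose(U.conj().T @ H @ U, H)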

Particle on a line

The most basic example is the free particle on \mathbb{R}. Here the Hamiltonian is H = \frac{p^2}{2m} where m is mass (a constant) and p = -i \hbar \frac{\partial}{\partial x} is momentum (an observable), which is, up to normalization, the infinitesimal generator of translation. (Note that translation clearly acts by a one-parameter group of unitary transformations on L^2(\mathbb{R}), so Stone’s theorem assures us we have a self-adjoint operator here.) This gives

\displaystyle H = \frac{- \hbar^2 }{2m} \frac{\partial^2}{\partial x^2}

which is, up to normalization, the ordinary Laplacian. Formally, the eigenvectors of H with non-negative eigenvalues are e^{ikx}, k \in \mathbb{R} with corresponding eigenvalues

\displaystyle E_k = \frac{(\hbar k)^2}{2m}.

We have chosen these eigenvectors because they are also eigenvectors of the momentum operator with eigenvalues \hbar k, which recovers the de Broglie relation p = \hbar k. (The other de Broglie relation, E = \hbar \omega, is already built into the Schrödinger equation since an eigenvector for the Hamiltonian with eigenvalue E is multiplied by e^{ \frac{E}{i \hbar} t} = e^{-i \omega t} in time evolution.) Note that these “eigenvectors” do not actually exist in L^2(\mathbb{R}). There is a formalism for dealing with this, but I don’t know it; in any case it can be dealt with.

Time evolution for a single eigenvector e^{ikx} is given by

\displaystyle e^{ikx} \mapsto e^{ \frac{E_k}{i \hbar} t} e^{ikx} = e^{i(kx - \omega_k t)}.

Of course multiplying the state vector by a nonzero complex number doesn’t affect anything we can measure about the state. It’s only when two or more eigenvectors are added together that the multiplication above manifests itself as (what appears to be) interference between waves. In fact, the above describes precisely a plane wave with angular frequency \omega_k = \frac{E_k}{\hbar} and wavenumber k.
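These formal statements are easy to check symbolically; the sympy sketch below (the symbols are my own setup) verifies that e^{i(kx - \omega_k t)} is an eigenvector of p with eigenvalue \hbar k and satisfies the free Schrödinger equation when \omega_k = \frac{\hbar k^2}{2m}.

    import sympy as sp

    x, t, k = sp.symbols('x t k', real=True)
    m, hbar = sp.symbols('m hbar', positive=True)
    omega = hbar * k**2 / (2 * m)                  # omega_k = E_k / hbar
    psi = sp.exp(sp.I * (k * x - omega * t))       # plane wave e^{i(kx - omega_k t)}

    # Momentum: p psi = -i hbar d/dx psi = hbar k psi  (de Broglie, p = hbar k).
    p_psi = -sp.I * hbar * sp.diff(psi, x)
    assert sp.simplify(p_psi - hbar * k * psi) == 0

    # Free Schroedinger equation: i hbar d/dt psi = -(hbar^2 / 2m) d^2/dx^2 psi.
    lhs = sp.I * hbar * sp.diff(psi, t)
    rhs = -(hbar**2 / (2 * m)) * sp.diff(psi, x, 2)
    assert sp.simplify(lhs - rhs) == 0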

Since H is a scalar multiple of the square of p, it follows that [H, p] = 0, so we recover conservation of momentum. The operator x (more precisely, multiplication by x), whose eigenvalues describe the position of our free particle, should not commute with either H or p since it should not be conserved. Instead, we have the canonical commutation relation

\displaystyle [x, p] = - i \hbar \left[ x, \frac{\partial}{\partial x} \right] = i \hbar

which gives (using the fact that [-, x] is a derivation)

\displaystyle [H, x] = \left[ \frac{p^2}{2m}, x \right] = \frac{p}{m} \left[ p, x \right] = - i \hbar \frac{p}{m}.

This gives

\displaystyle x(t) = x + \frac{p}{m} t

so we recover the familiar fact that \frac{p}{m} is the velocity of a free particle.
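Both commutation relations can be verified by applying the operators to a test function; here is a sympy sketch (the test function f is an arbitrary placeholder of mine).

    import sympy as sp

    x = sp.Symbol('x', real=True)
    m, hbar = sp.symbols('m hbar', positive=True)
    f = sp.Function('f')(x)                 # an arbitrary test function

    def p(g):                               # momentum p = -i hbar d/dx
        return -sp.I * hbar * sp.diff(g, x)

    def H(g):                               # free Hamiltonian H = p^2 / 2m
        return p(p(g)) / (2 * m)

    # Canonical commutation relation: [x, p] f = i hbar f.
    assert sp.simplify(x * p(f) - p(x * f) - sp.I * hbar * f) == 0

    # [H, x] f = -i hbar (p/m) f, the relation that gives x(t) = x + (p/m) t.
    assert sp.simplify(H(x * f) - x * H(f) + (sp.I * hbar / m) * p(f)) == 0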

Particle in a box

In order to work with some eigenvectors which actually exist in L^2(\mathbb{R}) (hence which give rise to well-defined probability distributions), let’s restrict our formerly free particle to a box [0, L]. This is equivalent to requiring that the state vector vanish outside of this interval. If we further require that the state vector is continuous, then it must vanish at 0 and L. The eigenspace of the Hamiltonian with eigenvalue E_k = \frac{(\hbar k)^2}{2m} is spanned by e^{\pm ikx}, and in each of these eigenspaces we can find normalized eigenvectors

\displaystyle \psi_n = \sqrt{ \frac{2}{L} } \sin \frac{n \pi x}{L}

vanishing at 0 and L whenever k = \frac{n \pi }{L}, n \in \mathbb{N}. This is a complete list of eigenvectors of the Laplacian vanishing at 0 and L, and time evolution is given by

\displaystyle \sqrt{ \frac{2}{L} } \sin \frac{n \pi x}{L} \mapsto \sqrt{ \frac{2}{L} } e^{ \frac{E_k}{i \hbar} t} \sin \frac{n \pi x}{L}

where

\displaystyle E_k = \frac{\hbar^2 \pi^2}{2mL^2} n^2.

Now that we’ve restricted ourselves to a compact space, we finally get a system in which the energy eigenvalues are discrete, or quantized (from which quantum mechanics gets its name). Intuitively speaking, a particle in a box behaves like a wave, constantly bouncing against the walls; if it oscillates at the wrong frequency, destructive interference would eventually cause it to disappear. Only certain frequencies, dictated by the shape of the box, survive.

Note that the lowest energy eigenvalue is not zero; it occurs when n = 1. So unlike the classical case, a quantum particle in a box cannot have zero energy.
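The discreteness is easy to see numerically: discretizing the Hamiltonian on a grid with Dirichlet boundary conditions (a standard finite-difference sketch of mine, with \hbar = m = L = 1 for convenience) reproduces the lowest eigenvalues \frac{\hbar^2 \pi^2 n^2}{2mL^2}.

    import numpy as np

    # Discretize H = -(hbar^2 / 2m) d^2/dx^2 on [0, L] with Dirichlet boundary
    # conditions, via a standard second-order finite-difference Laplacian.
    hbar = m = L = 1.0                      # units chosen for convenience
    N = 1000                                # number of interior grid points
    dx = L / (N + 1)

    laplacian = (np.diag(np.full(N, -2.0)) +
                 np.diag(np.ones(N - 1), 1) +
                 np.diag(np.ones(N - 1), -1)) / dx**2
    H = -(hbar**2 / (2 * m)) * laplacian

    numeric = np.linalg.eigvalsh(H)[:5]     # five lowest energy eigenvalues
    exact = hbar**2 * np.pi**2 * np.arange(1, 6)**2 / (2 * m * L**2)
    assert np.allclose(numeric, exact, rtol=1e-3)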

Particle in a potential

Next we consider the case when a particle on \mathbb{R} is subject to some potential. Potentials modify momentum, but the relationship between momentum and velocity should remain intact, so we still want

\displaystyle \frac{d}{dt} x(t) = \frac{p(t)}{m}

hence we still want [H, x] = -i \hbar \frac{p}{m}. We can guarantee this whenever H has the form

\displaystyle H = \frac{p^2}{2m} + V(x)

where V(x) is a multiplication operator, since [V(x), x] = 0, and in fact V(x) is the desired potential. Potentials, not being translation-invariant in general, break conservation of momentum, and we instead have [H, p] = [V(x), p]. Now, let us suppose that

\displaystyle V(x) = \sum_{n \ge 0} v_n x^n

is a nice analytic function of x. By induction from the relation [x, p] = i \hbar, using the fact that [-, p] is a derivation, we conclude that [x^n, p] = i \hbar n x^{n-1}, hence (at least formally)

\displaystyle [H, p] = [V(x), p] = i \hbar V'(x).

From this it follows that

\displaystyle \frac{d}{dt} p(t) = - V'(x(t))

which recovers Newton’s second law F = \frac{d}{dt} p, remembering that F = - V' classically.

We can deduce all this despite the fact that for arbitrary V it is not at all obvious how to directly write down the eigenvectors or eigenvalues of the corresponding Hamiltonian.
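As a concrete check of the identity [V(x), p] = i \hbar V'(x), here is a sympy computation with a sample cubic potential applied to a test function (the coefficients v_0, \dots, v_3 and f are arbitrary placeholders of mine):

    import sympy as sp

    x = sp.Symbol('x', real=True)
    hbar = sp.Symbol('hbar', positive=True)
    v0, v1, v2, v3 = sp.symbols('v0 v1 v2 v3')
    V = v0 + v1 * x + v2 * x**2 + v3 * x**3  # a sample polynomial potential
    f = sp.Function('f')(x)                  # an arbitrary test function

    def p(g):                                # momentum p = -i hbar d/dx
        return -sp.I * hbar * sp.diff(g, x)

    # [V(x), p] f = i hbar V'(x) f
    lhs = V * p(f) - p(V * f)
    rhs = sp.I * hbar * sp.diff(V, x) * f
    assert sp.simplify(lhs - rhs) == 0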

A particularly simple case occurs when V(x) = -Fx for some constant F. Then we compute that [H, p] = -i \hbar F, hence that

\displaystyle x(t) = x + \frac{p}{m} t + \frac{1}{2} \frac{F}{m} t^2

\displaystyle p(t) = p + Ft

exactly as in the classical case. I do not know what the eigenvectors of the Hamiltonian look like in this case.
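We can at least confirm symbolically that these formulas satisfy \frac{d}{dt} x(t) = \frac{p(t)}{m} and \frac{d}{dt} p(t) = F with the right initial conditions; in the sympy sketch below x and p are treated as commuting symbols, which is harmless here since the formulas are linear in them.

    import sympy as sp

    t = sp.Symbol('t', real=True)
    m = sp.Symbol('m', positive=True)
    F = sp.Symbol('F', real=True)           # the constant force
    x0, p0 = sp.symbols('x0 p0')            # x and p at time 0

    x_t = x0 + (p0 / m) * t + sp.Rational(1, 2) * (F / m) * t**2
    p_t = p0 + F * t

    assert sp.simplify(sp.diff(x_t, t) - p_t / m) == 0   # dx/dt = p(t)/m
    assert sp.diff(p_t, t) == F                          # dp/dt = F = -V'(x)
    assert x_t.subs(t, 0) == x0 and p_t.subs(t, 0) == p0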

Another simple case of a particle in a potential is the quantum harmonic oscillator, where V(x) = \frac{1}{2} m \omega^2 x^2 for some constant \omega, so that

\displaystyle H = \frac{p^2}{2m} + \frac{m \omega^2 x^2}{2}.

This gives

\displaystyle [H, p] = i \hbar m \omega^2 x

which, together with the relation [H, x] = -i \hbar \frac{p}{m}, implies that the iterated commutators of either p or x with H are (up to normalizing factors) periodic, alternating between some constant times p and some constant times x. In fact, we have

\displaystyle x(t) = x \cos \omega t + \frac{p}{m \omega} \sin \omega t

\displaystyle p(t) = p \cos \omega t - m \omega x \sin \omega t

again exactly as in the classical case of, for example, a spring-mass system. Note that in this special case, conservation of energy is equivalent to the Pythagorean theorem!
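Again these formulas can be confirmed symbolically: the sympy sketch below checks the Heisenberg equations of motion, the initial conditions, and the fact that H(t) = H via \cos^2 + \sin^2 = 1. (Commuting symbols stand in for x and p; the cross terms that need to cancel appear in the symmetric combination xp + px, so nothing depends on operator ordering.)

    import sympy as sp

    t = sp.Symbol('t', real=True)
    m, omega = sp.symbols('m omega', positive=True)
    x0, p0 = sp.symbols('x0 p0')            # x and p at time 0

    x_t = x0 * sp.cos(omega * t) + (p0 / (m * omega)) * sp.sin(omega * t)
    p_t = p0 * sp.cos(omega * t) - m * omega * x0 * sp.sin(omega * t)

    # Heisenberg equations of motion and initial conditions.
    assert sp.simplify(sp.diff(x_t, t) - p_t / m) == 0             # dx/dt = p(t)/m
    assert sp.simplify(sp.diff(p_t, t) + m * omega**2 * x_t) == 0  # dp/dt = -V'(x(t))
    assert x_t.subs(t, 0) == x0 and p_t.subs(t, 0) == p0

    # Conservation of energy via cos^2 + sin^2 = 1:  H(t) = H.
    H_t = p_t**2 / (2 * m) + m * omega**2 * x_t**2 / 2
    H_0 = p0**2 / (2 * m) + m * omega**2 * x0**2 / 2
    assert sp.simplify(sp.expand(H_t - H_0)) == 0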

The quantum harmonic oscillator is important in quantum field theory for reasons I don't understand yet. Part of its basic importance to quantum mechanics can be understood as follows: expanding an arbitrary potential V(x) in the neighborhood of a critical point, the linear term vanishes, so generically we get a constant plus a quadratic term (plus higher-order corrections, which are negligible near the critical point); hence, to second order, a harmonic oscillator.

There is a very elegant way to write down the eigenvectors of the Hamiltonian in this case using Dirac’s ladder method, but I think it would be best to leave such considerations until I understand the harmonic oscillator better.

Generalizations

One can take tensor products of any of the examples above to get examples in higher dimensions; for example, one can consider particles in boxes in \mathbb{R}^n for any n. More generally one can consider particles on any Riemannian manifold, where the kinetic term of the Hamiltonian is taken to be a suitable multiple of the Laplacian. Particularly interesting cases include manifolds with large symmetry groups, since the eigenspaces of the Laplacian break up into irreducible representations of these groups.

