I like the path integral. The question of whether the path integral or canonical quantization is "more fundamental" is purely ideological, and impossible to answer. On the one hand, canonical quantization is rigorously defined in a much wider setting than the path integral; on the other hand, canonical quantization introduces "order ambiguities" in quantizing certain operators, and requires renormalization to define operators like $\phi(0)^2,$ while both of these issues are automatically avoided in the path integral. These advantages make me, personally, a path integral devotee.
One big advantage of canonical quantization over the path integral, however, is that it automatically gives you the canonical (anti)commutation relations between field operators and their conjugate momenta; in fact, this is basically the definition of canonical quantization: it gives a prescription for defining canonical (anti)commutators in terms of a bracket on the phase space of the classical theory.
If you want to study systems primarily using the path integral, however, then you should know how to derive canonical (anti)commutation relations from the path integral perspective. This short note explains how that calculation works for quantum mechanical systems with standard kinetic terms; similar considerations apply for quantum field theories, or to systems with nonstandard kinetic terms, and I think the basic quantum mechanical examples should suffice to explain the basic idea. The calculation has the flavor of a Ward identity calculation for the transformation where all fields are shifted by a constant — $\phi(x) \mapsto \phi(x) + c$ — except that this transformation is generally not a symmetry. We'll derive everything from scratch, though, without actually referencing Ward identities.
In section 1, I will derive canonical commutation relations for a single bosonic degree of freedom using the path integral.
In section 2, I will derive canonical anticommutation relations for $N$ fermionic degrees of freedom using the path integral.
Prerequisites: Path integral formulation of quantum mechanics. Section 2 requires some knowledge of Grassmann variables, but not much. Several times in this post I use the term "fields" to refer to the position function $x(t)$ in bosonic quantum mechanics, or the Grassmann-valued function $\psi(t)$ in fermionic quantum mechanics. This is because I tend to think of quantum mechanics as a one-dimensional QFT.
Table of Contents

1. Canonical commutation relations
2. Canonical anticommutation relations
1. Canonical commutation relations
Consider a single bosonic particle with the usual kinetic term. The classical theory is a theory of paths $x(t)$ with action
$$S[x] = \int dt\, \left( \frac{1}{2} m \dot{x}^2 - V(x) \right).$$
The variation of this action under a compact perturbation $x(t) \mapsto x(t) + \epsilon \delta x(t)$ is
$$\delta S = \int dt\, \left( - m \ddot{x} - V'(x) \right) \delta x, \tag{1}$$
which just gives us the standard equation of motion $m \ddot{x} = - V'(x)$ via the principle of stationary action.
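In more detail: expanding $S[x + \epsilon \delta x]$ to first order in $\epsilon$ and integrating by parts gives
$$S[x + \epsilon \delta x] = S[x] + \epsilon \int dt\, \left( m \dot{x}\, \delta\dot{x} - V'(x)\, \delta x \right) + O(\epsilon^2) = S[x] + \epsilon \int dt\, \left( - m \ddot{x} - V'(x) \right) \delta x + O(\epsilon^2),$$
where the boundary terms from the integration by parts vanish because $\delta x$ is compactly supported.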
If $P[x](t)$ and $Q[x](t')$ are functionals of the path $x$ and its derivatives, evaluated at times $t$ and $t'$ respectively, then time-ordered correlation functions are computed in the path integral by the formal expression
$$\langle P[x](t) Q[x](t') \rangle = \frac{\int \mathcal{D} x\, e^{i S[x]} P[x](t) Q[x](t')}{\int \mathcal{D} x\, e^{i S[x]}}.$$
Note that I haven't written the time-ordering symbol explicitly; $\langle X \rangle$ will always denote a time-ordered correlation function, not the vacuum expectation value $\langle \Omega | X | \Omega \rangle.$
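In other words, assuming the usual vacuum boundary conditions on the path integral, the expression above computes $\langle \Omega | P[x](t) Q[x](t') | \Omega \rangle$ when $t > t',$ and $\langle \Omega | Q[x](t') P[x](t) | \Omega \rangle$ when $t' > t.$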
Now, suppose we consider the relabeling $y(t) = x(t) + \epsilon \rho(t),$ where $\rho(t)$ is some fixed, compactly supported function that we add to every possible path $x(t).$ This is just a change of variables in the path integral; it seems manifest that it should have trivial Jacobian, so we should have
$$\int \mathcal{D} x\, e^{i S[x]} = \int \mathcal{D} x\, e^{i S[y]}.$$
In particular, this means that the correlation functions should satisfy
$$\langle P[x](t) Q[x](t') \rangle = \frac{\int \mathcal{D} x\, e^{i S[y]} P[y](t) Q[y](t')}{\int \mathcal{D} x\, e^{i S[x]}}. \tag{2}$$
In the limit as $\epsilon$ goes to zero, we can expand $P$ as
$$P[y](t) = P[x](t) + \epsilon \left( \frac{\partial P}{\partial x}[x](t) \rho(t) + \frac{\partial P}{\partial \dot{x}}[x](t) \dot{\rho}(t) + \dots \right) + O(\epsilon^2),$$
and similarly for $Q$.
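For instance, if $P[x](t) = x(t)^2,$ then $\delta_{\rho} P[x](t) = 2 x(t) \rho(t);$ if $P[x](t) = m \dot{x}(t),$ then $\delta_{\rho} P[x](t) = m \dot{\rho}(t).$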
Let's package the whole parenthetical term and call it $\delta_{\rho} P[x](t).$ We can also write the corresponding variation in the action as $S[y] = S[x] + \epsilon \delta_{\rho} S[x] + O(\epsilon^2),$ so that $e^{i S[y]} = e^{i S[x]} \left( 1 + i \epsilon\, \delta_{\rho} S[x] + O(\epsilon^2) \right).$ Equation (2) can then be written as
$$\begin{split} \langle P[x](t) Q[x](t') \rangle & = \langle P[x](t) Q[x](t') \rangle + \epsilon \langle \delta_{\rho}P[x](t) Q[x](t') \rangle + \epsilon \langle P[x](t) \delta_{\rho} Q[x](t') \rangle \\ & + i \epsilon \langle \delta_{\rho} S[x] P[x](t) Q[x](t') \rangle + O(\epsilon^2).\end{split}$$
Matching the order-$\epsilon$ terms on both sides of the equation gives us
$$\langle \delta_{\rho}P[x](t) Q[x](t') \rangle + \langle P[x](t) \delta_{\rho} Q[x](t') \rangle + i \langle \delta_{\rho} S[x] P[x](t) Q[x](t') \rangle = 0. \tag{3}$$
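As a quick sanity check on equation (3): if we take $P$ and $Q$ to be the trivial functional $1,$ then $\delta_{\rho} P = \delta_{\rho} Q = 0$ and (3) reduces to $\langle \delta_{\rho} S[x] \rangle = 0$ for every compactly supported $\rho,$ i.e.
$$\langle m \ddot{x}(t) + V'(x(t)) \rangle = 0.$$
The classical equation of motion holds inside correlation functions so long as no other operators are inserted; this is the simplest of the Schwinger–Dyson equations.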
All that remains is to compute $\delta_{\rho} S[x]$, which is easily found via equation (1) to be
$$\delta_{\rho} S[x] = \int d\tilde{t}\, \left( - m \ddot{x}(\tilde{t}) - V'(x(\tilde{t})) \right) \rho(\tilde{t}).$$
Now, suppose we take $\rho(\tilde{t})$ to be a step function that is equal to one on a neighborhood $[t - \alpha, t + \alpha]$ and zero everywhere else — in particular, zero at $t',$ which we can arrange by taking $\alpha$ small enough. Then the integral expression for $\delta_{\rho} S$ simplifies to
$$\delta_{\rho} S[x] = - m (\dot{x}(t+\alpha) - \dot{x}(t-\alpha)) - \int_{t-\alpha}^{t+\alpha} d\tilde{t}\, V'(x(\tilde{t})).$$
If we plug this into equation (3), then the correlation function that includes $\delta_{\rho} S$ becomes
$$- i m \langle (\dot{x}(t + \alpha) - \dot{x}(t - \alpha)) P[x](t) Q[x](t') \rangle - \dots,$$
where I've suppressed the second term, which depends on $V.$ In the limit $\alpha \rightarrow 0,$ this first term becomes the correlation function with the commutator of $\dot{x}(t)$ and $P[x](t)$ inserted:
$$- i m \langle [\dot{x}(t), P[x](t)] Q[x](t')\rangle.$$
The time ordering takes care of this for us — in the term with the plus sign the limit $\alpha \rightarrow 0$ produces the operator ordering $\dot{x}(t) P[x](t),$ while in the term with the minus sign the limit produces the ordering $P[x](t) \dot{x}(t).$
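Written out, the claim is that
$$\lim_{\alpha \rightarrow 0} \Big( \langle \dot{x}(t+\alpha)\, P[x](t)\, Q[x](t') \rangle - \langle \dot{x}(t-\alpha)\, P[x](t)\, Q[x](t') \rangle \Big) = \langle [\dot{x}(t), P[x](t)]\, Q[x](t') \rangle,$$
because for small enough $\alpha$ the time ordering places $\dot{x}(t+\alpha)$ to the left of $P[x](t)$ and $\dot{x}(t-\alpha)$ to the right of it, while the position of $Q[x](t')$ in the ordering is the same in both terms.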
The second term, the one that depends on $V$, is a little more complicated. A priori, we don't have a good way of understanding it. Naively, it looks like the operator $\int_{t-\alpha}^{t+\alpha} d\tilde{t}\, V'(x)$ goes to zero in the limit $\alpha \rightarrow 0,$ but we aren't considering that operator on its own; we're considering its insertion in the time-ordered correlator
$$- i \langle \int_{t-\alpha}^{t+\alpha} d\tilde{t}\, V'(x(\tilde{t})) P[x](t) Q[x](t') \rangle.$$
However, if we make the particular choice $P[x](t) = x(t),$ then $V'(x(\tilde{t}))$ commutes with $P[x](t)$ at coincident times, so the time-ordered correlator above is a bounded function of $\tilde{t}$ near $\tilde{t} = t$ — there are no contact terms concentrated at $\tilde{t} = t$ — and its integral over the shrinking window $[t - \alpha, t + \alpha]$ really does vanish in the limit $\alpha \rightarrow 0.$
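More explicitly, boundedness of the correlator near $\tilde{t} = t$ gives the estimate
$$\left| \int_{t-\alpha}^{t+\alpha} d\tilde{t}\, \langle V'(x(\tilde{t}))\, x(t)\, Q[x](t') \rangle \right| \leq 2\alpha \max_{|\tilde{t} - t| \leq \alpha} \left| \langle V'(x(\tilde{t}))\, x(t)\, Q[x](t') \rangle \right| \longrightarrow 0 \quad \text{as } \alpha \rightarrow 0.$$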
So, with $P[x](t) = x(t),$ and with $\rho(\tilde{t})$ the aforementioned step function in the limit $\alpha \rightarrow 0,$ our identity (3) becomes
$$\langle Q[x](t') \rangle - i m \langle [\dot{x}(t), x(t)] Q[x](t') \rangle = 0,$$
where we have used $\delta_{\rho} x(t) = 1$ and $\delta_{\rho} Q[x](t') = 0.$ Since $m \dot{x}(t)$ is just the momentum operator $p(t),$ a little rearranging makes this equation into
$$\langle [x(t), p(t)] Q[x](t') \rangle = i \langle Q[x](t') \rangle.$$
Since this identity holds in correlation functions where $Q[x](t')$ is completely arbitrary, it must hold in the sense of the operator equation
$$[x(t), p(t)] = i.$$
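If you want a quick numerical sanity check of this equal-time commutator — completely independent of the path integral argument, and in units where $\hbar = 1$ — you can build truncated harmonic-oscillator matrices for $x$ and $p$ and verify that their commutator is $i$ on every basis state except the highest one, where the truncation of the Hilbert space gets in the way. Here is a minimal sketch using NumPy:

```python
import numpy as np

# Truncated harmonic-oscillator basis |0>, ..., |N-1>, in units m = omega = hbar = 1.
# The annihilation operator acts as a|n> = sqrt(n) |n-1>.
N = 50
a = np.diag(np.sqrt(np.arange(1, N)), k=1)

x = (a + a.T) / np.sqrt(2)         # position operator
p = 1j * (a.T - a) / np.sqrt(2)    # momentum operator

comm = x @ p - p @ x

# [x, p] = i on every basis state except the last one, an artifact of truncation.
print(np.allclose(comm[:-1, :-1], 1j * np.eye(N - 1)))  # True
print(comm[-1, -1])                                     # -(N - 1) * i, the truncation artifact
```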
2. Canonical anticommutation relations
To describe fermions in the path integral, we work with a Grassmann algebra: a collection of formal generators $\chi_1, \chi_2, \dots$ that anticommute with one another,
$$\chi_j \chi_k = - \chi_k \chi_j.$$
One can define a notion of integration on functions of these symbols, from which it is possible to construct a path integral for any Grassmann-valued Lagrangian. We won't discuss the details at all, as we won't need them for this discussion.
The fermionic analogue of the quantum mechanics from the previous section is a theory of $N$ Grassmann-valued fields $\psi_1(t), \dots, \psi_N(t)$ with Lagrangian
$$\mathcal{L}[\psi](t) = \frac{i}{2} \sum_{j=1}^{N} \psi_j(t) \dot{\psi}_j(t) - V(\psi), \tag{4}$$
where $V(\psi)$ is some function of all the fields $\psi_1(t), \dots, \psi_N(t).$ Note that the fundamental generators of the Grassmann algebra, the $\chi_j$ symbols we discussed earlier, are different from the fields $\psi_j(t)$ that appear in this expression; in general, we will have something like
$$\psi_j(t) = \sum_{k} c_{jk}(t) \chi_{k},$$
where $\{c_{jk}(t)\}$ is a set of real-valued functions.
As in the previous section, we now consider shifting every field by a fixed, compactly supported function. Because the fields are Grassmann-valued, the shift must be Grassmann-valued as well; a generic Grassmann-odd function of $t$ has the form
$$b(t) = \sum_{j} \beta_j(t) \chi_j$$
with real-valued coefficient functions $\beta_j(t).$ So we pick compactly supported Grassmann-odd functions $b_1(t), \dots, b_N(t)$ of this form and consider the relabeling
$$\psi_j(t) \mapsto \psi_j(t) + \epsilon b_j(t).$$
The discussion leading to equation (3) in the previous section then applies with only minor notational changes. For any functionals $P[\psi](t)$ and $Q[\psi](t')$ of the fields, we obtain the identity
$$\langle \delta_{b} P[\psi](t) Q[\psi](t') \rangle + \langle P[\psi](t) \delta_{b} Q[\psi](t') \rangle + i \langle \delta_{b} S[\psi] P[\psi](t) Q[\psi](t') \rangle = 0. \tag{5}$$
The variation of the action, computed directly from the Lagrangian (4), is
$$\delta_{b} S[\psi] = \frac{i}{2} \sum_{j=1}^{N} \int d\tilde{t}\, \left( b_j(\tilde{t}) \dot{\psi}_j(\tilde{t}) + \psi_j(\tilde{t}) \dot{b}_j(\tilde{t}) \right) - \sum_{j=1}^{N} \int d\tilde{t}\, b_j(\tilde{t}) \frac{\partial V}{\partial \psi_j}. \tag{6}$$
The second term requires a little bit of care; I haven't told you how to differentiate a function of Grassmann variables. But rest assured that there is a way to define differentiation — the so-called "left Grassmann derivative" — for which equation (6) holds.
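For instance — just to illustrate the convention, with a potential I'm choosing purely as an example — take $N = 2$ and $V(\psi) = -i \omega\, \psi_1 \psi_2.$ Then the left derivatives are
$$\frac{\partial V}{\partial \psi_1} = -i \omega\, \psi_2, \qquad \frac{\partial V}{\partial \psi_2} = +i \omega\, \psi_1;$$
the sign flip in the second one comes from anticommuting $\psi_2$ to the left of $\psi_1$ before stripping it off.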
Integrating the second term of the first integral by parts (the boundary terms vanish because $b_j$ is compactly supported) gives
$$\delta_{b} S[\psi] = \frac{i}{2} \sum_{j=1}^{N} \int d\tilde{t}\, \left( b_j(\tilde{t}) \dot{\psi}_j(\tilde{t}) - \dot{\psi}_j(\tilde{t}) b_j(\tilde{t}) \right) - \sum_{j=1}^{N} \int d\tilde{t}\, b_j(\tilde{t}) \frac{\partial V}{\partial \psi_j}.$$
We then use the fact that $b_j$ and $\dot{\psi}_j$ anticommute to obtain
$$\delta_{b} S[\psi] = i \sum_{j=1}^{N} \int d\tilde{t}\, b_j(\tilde{t}) \dot{\psi}_j(\tilde{t}) - \sum_{j=1}^{N} \int d\tilde{t}\, b_j(\tilde{t}) \frac{\partial V}{\partial \psi_j}.$$
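From here the argument runs exactly parallel to section 1, so let me just sketch it (the signs below are my own bookkeeping, using the convention that time ordering of fermionic operators comes with a minus sign for each exchange). Take $P[\psi](t) = \psi_k(t),$ and take $b_j(\tilde{t}) = \delta_{jl}\, \rho(\tilde{t})\, \theta,$ where $\rho$ is the step function from section 1 and $\theta$ is a fixed Grassmann generator. The potential term drops out in the limit $\alpha \rightarrow 0$ just as before, and the kinetic term contributes
$$i\, \delta_{b} S[\psi] \rightarrow -\theta\, (\psi_l(t+\alpha) - \psi_l(t-\alpha)),$$
whose insertion next to $\psi_k(t)$ inside the time-ordered correlator produces, in the limit $\alpha \rightarrow 0,$ the combination $-\theta\, \{\psi_l(t), \psi_k(t)\}$ — an anticommutator rather than a commutator, precisely because fermionic time ordering swaps operators with a minus sign. Identity (5) then reads
$$\delta_{kl}\, \langle \theta\, Q[\psi](t') \rangle - \langle \theta\, \{\psi_l(t), \psi_k(t)\}\, Q[\psi](t') \rangle = 0,$$
and since $\theta$ and $Q[\psi](t')$ are arbitrary, this gives the canonical anticommutation relations
$$\{\psi_j(t), \psi_k(t)\} = \delta_{jk}.$$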