 ### Lei Mao

Machine Learning, Artificial Intelligence, Computer Science.

# Expected Value and Variance from the Perspective of Quantum Theory

### Introduction

Sometimes, quantum theory is a little bit difficult to understand. This is because we can hardly build connections between quantum theory and our actual daily experiences. If some of its concepts could be explained in terms of our experience and common sense, understanding quantum theory would probably be easier. Expected value and variance are two such concepts that can be easily connected to and explained by quantum theory.

In this blog post, I would like to derive how expected value and variance are connected to quantum theory, which is fundamental to Heisenberg’s Uncertainty Principle.

### Quantum Theory

In quantum theory, observation is simply applying an observable operator to the system superposition state vector, which modifies the system state.

Quantum theory postulates that:

1. To each physical observable there corresponds a hermitian operator $\Omega$.
2. The eigenvalues of a hermitian operator $\Omega$ associated with a physical observable are the only possible values the observable can take as a result of measuring it on any given state. Furthermore, the eigenvectors of $\Omega$ form a basis for the state space.

The first postulate assumes for each physical observable, such as length, velocity, momentum, etc., there corresponds a hermitian operator (matrix) $\Omega$.

The second postulate assumes that the eigenvectors of $\Omega$ form a basis for the system state superposition. The eigenvalues corresponding to the eigenvectors of $\Omega$ are the only possible values we could observe as a result of measurement.

According to quantum theory, a system state is a superposition of basic states, and the system state is determined once we take a measurement of it. Concretely, suppose we have a normalized system state superposition $\psi$ consisting of $n$ basic states, where $n$ could be infinitely large.

$| \psi \rangle = c_0 | x_0 \rangle + c_1 | x_1 \rangle + \cdots + c_{n-1} | x_{n-1} \rangle$

where

$|c_0|^2 + |c_1|^2 + \cdots + |c_{n-1}|^2 = 1$

$| x_0 \rangle$, $| x_1 \rangle$, $\cdots$, $| x_{n-1} \rangle$ are the eigenvectors of the hermitian operator $\Omega$ corresponding to the physical observable, and they form an orthonormal basis due to the special properties of hermitian matrices. Note that $c_i$ is a complex number and $|c_i|^2 = c_i \overline{c}_i$, where $\overline{c}_i$ is the complex conjugate of $c_i$.

Before taking the measurement, the system is a superposition of the $n$ basic states, meaning that a cat is both alive and dead inside the closed box. After taking the measurement, the system collapses into exactly one of the $n$ basic states, $| x_{i} \rangle$, with probability $|c_i|^2$, meaning that the cat could only be alive or dead after the box is opened and we look inside. The observed value $\lambda_i$ of the measurement is the eigenvalue corresponding to the eigenvector $| x_{i} \rangle$. The observed value $\lambda_i$ is a real number because the eigenvalues of a hermitian matrix are real, which matches what we observe daily in the real world.

This might be somewhat counter-intuitive, as the system state is a superposition and the basic states of the system are determined by the physical observable. That is normal, because this is not how we usually perceive the world. We can just accept these postulates for now, even if we cannot fully understand them.

Mathematically, let’s apply the hermitian matrix $\Omega$ corresponding to the physical observable to the system state superposition $\psi$.

\begin{align} \Omega | \psi \rangle &= c_0 \Omega | x_0 \rangle + c_1 \Omega | x_1 \rangle + \cdots + c_{n-1} \Omega | x_{n-1} \rangle \\ &= c_0 \lambda_0 | x_0 \rangle + c_1 \lambda_1 | x_1 \rangle + \cdots + c_{n-1} \lambda_{n-1} | x_{n-1} \rangle \\ \end{align}

Note that we used the property of eigenvalues and eigenvectors $\Omega | x_i \rangle = \lambda_i | x_i \rangle$.
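The derivation above can be sketched numerically. The following snippet uses a small, hypothetical $2 \times 2$ hermitian operator (chosen purely for illustration) and verifies that its eigenvalues are real, its eigenvectors are orthonormal, and that $\Omega | \psi \rangle = \sum_i c_i \lambda_i | x_i \rangle$:

```python
import numpy as np

# A small hermitian operator (hypothetical 2x2 observable, for illustration only).
omega = np.array([[2.0, 1.0 - 1.0j],
                  [1.0 + 1.0j, 3.0]])
assert np.allclose(omega, omega.conj().T)  # hermitian: Omega = Omega^dagger

# Eigendecomposition of a hermitian matrix: real eigenvalues, orthonormal eigenvectors.
eigenvalues, eigenvectors = np.linalg.eigh(omega)
assert np.allclose(eigenvectors.conj().T @ eigenvectors, np.eye(2))

# A normalized superposition |psi> = c_0 |x_0> + c_1 |x_1> with sum_i |c_i|^2 = 1.
c = np.array([0.6, 0.8j])
psi = eigenvectors @ c

# Applying Omega scales each eigenvector by its eigenvalue:
# Omega |psi> = sum_i c_i lambda_i |x_i>.
assert np.allclose(omega @ psi, eigenvectors @ (c * eigenvalues))
```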

### Expected Value

$\Omega | \psi \rangle$ and $|\psi\rangle$ are vectors of the same dimension. Let’s check what the inner product of $\Omega | \psi \rangle$ and $|\psi\rangle$ is.

\begin{align} \langle \Omega\psi, \psi \rangle &= \langle c_0 \lambda_0 | x_0 \rangle + c_1 \lambda_1 | x_1 \rangle + \cdots + c_{n-1} \lambda_{n-1} | x_{n-1} \rangle, c_0 | x_0 \rangle + c_1 | x_1 \rangle + \cdots + c_{n-1} | x_{n-1} \rangle \rangle \\ &= \sum_{i=0}^{n-1} \sum_{j=0}^{n-1} \langle c_i \lambda_i | x_i \rangle, c_j | x_j \rangle \rangle \\ &= \sum_{i=0}^{n-1} \sum_{j=0}^{n-1} c_i \overline{c}_j \lambda_i \langle | x_i \rangle, | x_j \rangle \rangle \end{align}

Because the eigenvectors of $\Omega$ are orthonormal, $\langle | x_i \rangle, | x_j \rangle \rangle = 1$ if $i = j$ and $\langle | x_i \rangle, | x_j \rangle \rangle = 0$ otherwise, we further have

\begin{align} \langle \Omega\psi, \psi \rangle &= \sum_{i=0}^{n-1} \sum_{j=0}^{n-1} c_i \overline{c}_j \lambda_i \langle | x_i \rangle, | x_j \rangle \rangle \\ &= \sum_{i=0}^{n-1} |c_i|^2 \lambda_i \end{align}

This is exactly the expected value of the observation! We could compute the expected value of an observation by just computing an inner product.

Note that $\langle \Omega\psi, \psi \rangle = \langle \psi, \Omega\psi \rangle$ if $\Omega$ is hermitian, even though in general a complex inner product is only conjugate-symmetric, $\langle u, v \rangle = \overline{\langle v, u \rangle}$. We denote $\langle \Omega \rangle_{\psi} = \langle \Omega\psi, \psi \rangle = \langle \psi, \Omega\psi \rangle$.

Mathematically, if the physical observable is $X$, $\mathbb{E}(X) = \langle \Omega \rangle_{\psi}$.
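This identity is easy to check numerically. The sketch below builds a random (illustrative) hermitian operator and normalized state, and verifies that the inner product $\langle \Omega\psi, \psi \rangle = \psi^{\dagger} \Omega \psi$ equals the probabilistic expected value $\sum_i |c_i|^2 \lambda_i$:

```python
import numpy as np

# A random hermitian observable and a random normalized state (illustrative only).
rng = np.random.default_rng(0)
a = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
omega = (a + a.conj().T) / 2  # symmetrize to force hermitian

psi = rng.normal(size=3) + 1j * rng.normal(size=3)
psi /= np.linalg.norm(psi)  # normalize so that <psi, psi> = 1

# Expected value as an inner product: <Omega psi, psi> = psi^dagger Omega psi.
expectation_inner = (psi.conj() @ (omega @ psi)).real

# The same value from the probabilistic definition sum_i |c_i|^2 lambda_i.
eigenvalues, eigenvectors = np.linalg.eigh(omega)
c = eigenvectors.conj().T @ psi  # coefficients of psi in the eigenbasis
expectation_prob = np.sum(np.abs(c) ** 2 * eigenvalues)

assert np.isclose(expectation_inner, expectation_prob)
```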

### Variance

Given hermitian operator $\Omega$, let’s further introduce another hermitian operator

$\Delta_{\psi}(\Omega) = \Omega - \langle \Omega \rangle_{\psi} I$

We would like to see what the expected value of the observation corresponding to this hermitian operator is.

\begin{align} \langle \Delta_{\psi}(\Omega) \rangle_{\psi} &= \langle \Delta_{\psi}(\Omega)\psi, \psi \rangle \\ &= \langle ( \Omega - \langle \Omega \rangle_{\psi} I )\psi, \psi \rangle \\ &= \langle \Omega \psi, \psi \rangle - \langle \langle \Omega \rangle_{\psi} I \psi, \psi \rangle \\ &= \langle \Omega \rangle_{\psi} - \langle \Omega \rangle_{\psi} \langle \psi, \psi \rangle \\ \end{align}

Because

\begin{align} \langle \psi, \psi \rangle &= |c_0|^2 + |c_1|^2 + \cdots + |c_{n-1}|^2 \\ &= 1 \end{align}

Therefore,

\begin{align} \langle \Delta_{\psi}(\Omega) \rangle_{\psi} &= 0 \end{align}

Essentially, the measurement corresponding to the hermitian operator $\Delta_{\psi}(\Omega)$ is the measurement of the physical observable minus its expected value, and its expected value is always zero.

Mathematically, if the physical observable is $X$, $\mathbb{E}(X - \mu) = 0$ where $\mu = \mathbb{E}(X)$.
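The centered operator can be verified with the same kind of numerical sketch (again with a random, illustrative hermitian operator and state):

```python
import numpy as np

# Random hermitian observable and normalized state (illustrative only).
rng = np.random.default_rng(1)
a = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
omega = (a + a.conj().T) / 2

psi = rng.normal(size=3) + 1j * rng.normal(size=3)
psi /= np.linalg.norm(psi)

expectation = (psi.conj() @ omega @ psi).real  # <Omega>_psi

# Delta_psi(Omega) = Omega - <Omega>_psi I
delta = omega - expectation * np.eye(3)

# Its expected value on psi is always zero: E(X - mu) = 0.
assert np.isclose((psi.conj() @ delta @ psi).real, 0.0)
```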

How about variance? Mathematically, variance is defined as

\begin{align} \mathbb{V}(X) &= \mathbb{E}((X - \mu)^2) \\ &= \mathbb{E}(X^2) - \mathbb{E}(X)^2 \end{align}

What is the observable in this case? It is $(X - \mu)^2$. Its corresponding hermitian operator must be $(\Delta_{\psi}(\Omega)) (\Delta_{\psi}(\Omega))$!

Therefore, the variance of the observation is $\langle (\Delta_{\psi}(\Omega)) (\Delta_{\psi}(\Omega)) \rangle_{\psi}$, and we define $\mathbb{V}_{\psi}(\Omega) = \langle (\Delta_{\psi}(\Omega)) (\Delta_{\psi}(\Omega)) \rangle_{\psi}$.

If you are not convinced by this, let’s show a formal proof.

Expanding the squared operator,

\begin{align} (\Delta_{\psi}(\Omega)) (\Delta_{\psi}(\Omega)) &= (\Omega - \langle \Omega \rangle_{\psi} I) (\Omega - \langle \Omega \rangle_{\psi} I) \\ &= \Omega^2 - 2 \langle \Omega \rangle_{\psi} \Omega + \langle \Omega \rangle_{\psi}^2 I \end{align}

Applying $\Omega^2$ to the state superposition gives

\begin{align} \Omega^2 | \psi \rangle &= c_0 \Omega^2 | x_0 \rangle + c_1 \Omega^2 | x_1 \rangle + \cdots + c_{n-1} \Omega^2 | x_{n-1} \rangle \\ &= c_0 \lambda_0^2 | x_0 \rangle + c_1 \lambda_1^2 | x_1 \rangle + \cdots + c_{n-1} \lambda_{n-1}^2 | x_{n-1} \rangle \\ \end{align}

so that

\begin{align} \langle \Omega^2 \rangle_{\psi} &= \langle \Omega^2\psi, \psi \rangle \\ &= \langle c_0 \lambda_0^2 | x_0 \rangle + c_1 \lambda_1^2 | x_1 \rangle + \cdots + c_{n-1} \lambda_{n-1}^2 | x_{n-1} \rangle, c_0 | x_0 \rangle + c_1 | x_1 \rangle + \cdots + c_{n-1} | x_{n-1} \rangle \rangle \\ &= \sum_{i=0}^{n-1} \sum_{j=0}^{n-1} \langle c_i \lambda_i^2 | x_i \rangle, c_j | x_j \rangle \rangle \\ &= \sum_{i=0}^{n-1} \sum_{j=0}^{n-1} c_i \overline{c}_j \lambda_i^2 \langle | x_i \rangle, | x_j \rangle \rangle \\ &= \sum_{i=0}^{n-1} |c_i|^2 \lambda_i^2 \\ &= \mathbb{E}(X^2) \end{align}

Putting these together,

\begin{align} \mathbb{V}_{\psi}(\Omega) &= \langle (\Delta_{\psi}(\Omega)) (\Delta_{\psi}(\Omega)) \rangle_{\psi} \\ &= \langle (\Delta_{\psi}(\Omega)) (\Delta_{\psi}(\Omega))\psi, \psi \rangle \\ &= \langle (\Omega^2 - 2 \langle \Omega \rangle_{\psi} \Omega + \langle \Omega \rangle_{\psi}^2 I) \psi, \psi \rangle \\ &= \langle \Omega^2\psi, \psi \rangle -2 \langle \Omega \rangle_{\psi} \langle \Omega \psi, \psi \rangle + \langle \Omega \rangle_{\psi}^2 \langle \psi, \psi \rangle \\ &= \mathbb{E}(X^2) -2 \langle \Omega \rangle_{\psi}^2 + \langle \Omega \rangle_{\psi}^2 \\ &= \mathbb{E}(X^2) - \langle \Omega \rangle_{\psi}^2\\ &= \mathbb{E}(X^2) - \mathbb{E}(X)^2\\ &= \mathbb{V}(X) \\ \end{align}

This concludes the proof.
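The proof can also be checked numerically. The sketch below (again with a random, illustrative hermitian operator and state) verifies that $\langle (\Delta_{\psi}(\Omega)) (\Delta_{\psi}(\Omega)) \rangle_{\psi}$ matches the classical variance of the eigenvalue distribution $p_i = |c_i|^2$:

```python
import numpy as np

# Random hermitian observable and normalized state (illustrative only).
rng = np.random.default_rng(2)
a = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
omega = (a + a.conj().T) / 2

psi = rng.normal(size=4) + 1j * rng.normal(size=4)
psi /= np.linalg.norm(psi)

# Variance from the centered operator: <Delta_psi(Omega)^2>_psi.
expectation = (psi.conj() @ omega @ psi).real
delta = omega - expectation * np.eye(4)
variance_operator = (psi.conj() @ (delta @ delta) @ psi).real

# Classical variance of the distribution p_i = |c_i|^2 over eigenvalues lambda_i:
# V(X) = E(X^2) - E(X)^2.
eigenvalues, eigenvectors = np.linalg.eigh(omega)
p = np.abs(eigenvectors.conj().T @ psi) ** 2
variance_classical = np.sum(p * eigenvalues ** 2) - np.sum(p * eigenvalues) ** 2

assert np.isclose(variance_operator, variance_classical)
```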

### Conclusions

We have learned how quantum theory is related to physical observations via mathematics. It is amazing that statistics can be explained using quantum theory.