
Introduction

2D plane transformations are an interesting piece of geometry and are widely used in modern computer vision and photography.


In this blog post, I would like to discuss the mathematics of some common 2D transformations and prove some of their interesting properties.

Translation

The 2D translation could be simply described as

\[\begin{align} \mathbf{x}^{\prime} &= \mathbf{x} + \mathbf{t} \\ &= \begin{bmatrix} x \\ y \\ \end{bmatrix} + \begin{bmatrix} t_x \\ t_y \\ \end{bmatrix} \\ \end{align}\]

Alternatively, we could write the translation operation as matrix multiplication.

\[\begin{align} \mathbf{x}^{\prime} &= \mathbf{x} + \mathbf{t} \\ &= \mathbf{I} \mathbf{x} + \mathbf{t} \\ &= \begin{bmatrix} \mathbf{I} & \mathbf{t} \\ \end{bmatrix} \begin{bmatrix} \mathbf{x} \\ 1 \\ \end{bmatrix} \\ &= \begin{bmatrix} \mathbf{I} & \mathbf{t} \\ \end{bmatrix} \bar{\mathbf{x}} \end{align}\]
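
As a quick sanity check, here is a minimal NumPy sketch (the point and translation values are made up for illustration) of the translation written as a matrix product with the augmented coordinate:

```python
import numpy as np

# Illustrative values; the point and the translation are made up.
x = np.array([1.0, 2.0])                        # original point
t = np.array([3.0, -1.0])                       # translation vector

# [I | t] applied to the augmented coordinate [x; 1].
I_t = np.hstack([np.eye(2), t.reshape(2, 1)])   # the 2 x 3 matrix [I, t]
x_bar = np.append(x, 1.0)                       # augmented coordinate

x_prime = I_t @ x_bar
assert np.allclose(x_prime, x + t)              # identical to x + t
print(x_prime)                                  # [4. 1.]
```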

Rotation + Translation

Rotation Matrix

The 2D rotation matrix $\mathbf{R}$ could be derived using polar coordinates. Suppose $\mathbf{x} = (x, y) = (r \cos \phi, r \sin \phi)$ in polar coordinates, and $\mathbf{x}^{\prime}$ is obtained by rotating $\mathbf{x}$ around the origin by an angle $\theta$.

\[\begin{align} \mathbf{x}^{\prime} &= \begin{bmatrix} x^{\prime} \\ y^{\prime} \\ \end{bmatrix} \\ &= \begin{bmatrix} r \cos(\phi + \theta) \\ r \sin(\phi + \theta) \\ \end{bmatrix} \\ &= r \begin{bmatrix} \cos \phi \cos \theta - \sin \phi \sin \theta \\ \cos \phi \sin \theta + \sin \phi \cos \theta \\ \end{bmatrix} \\ &= r \begin{bmatrix} \cos \theta & -\sin \theta \\ \sin \theta & \cos \theta \\ \end{bmatrix} \begin{bmatrix} \cos \phi \\ \sin \phi \\ \end{bmatrix} \\ &= \begin{bmatrix} \cos \theta & -\sin \theta \\ \sin \theta & \cos \theta \\ \end{bmatrix} \begin{bmatrix} r \cos \phi \\ r \sin \phi \\ \end{bmatrix} \\ &= \begin{bmatrix} \cos \theta & -\sin \theta \\ \sin \theta & \cos \theta \\ \end{bmatrix} \begin{bmatrix} x \\ y \\ \end{bmatrix} \\ &= \mathbf{R} \mathbf{x} \end{align}\]

where

\[\mathbf{R} = \begin{bmatrix} \cos \theta & -\sin \theta \\ \sin \theta & \cos \theta \\ \end{bmatrix}\]

Also notice that $\mathbf{R}$ is orthonormal, i.e., $\mathbf{R}\mathbf{R}^{\top} = \mathbf{R}^{\top}\mathbf{R}= \mathbf{I}$, and the determinant $\lvert \mathbf{R} \rvert = 1$.
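
Here is a minimal NumPy sketch (the helper name `rotation_matrix` and the 30-degree angle are just illustrative choices) that builds $\mathbf{R}$ and numerically checks the orthonormality and determinant properties:

```python
import numpy as np

def rotation_matrix(theta):
    """2D rotation matrix for a counterclockwise rotation by theta radians."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s],
                     [s,  c]])

R = rotation_matrix(np.deg2rad(30.0))   # arbitrary illustrative angle

# R is orthonormal and its determinant is 1.
assert np.allclose(R @ R.T, np.eye(2))
assert np.allclose(R.T @ R, np.eye(2))
assert np.isclose(np.linalg.det(R), 1.0)
```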

Rotation + Translation

The 2D rotation followed by translation, also known as the Euclidean transformation, can, therefore, be described as

\[\begin{align} \mathbf{x}^{\prime} &= \mathbf{R} \mathbf{x} + \mathbf{t} \\ &= \begin{bmatrix} \mathbf{R} & \mathbf{t} \\ \end{bmatrix} \begin{bmatrix} \mathbf{x} \\ 1 \\ \end{bmatrix} \\ &= \begin{bmatrix} \mathbf{R} & \mathbf{t} \\ \end{bmatrix} \bar{\mathbf{x}} \end{align}\]

This transformation preserves the Euclidean distance between any two points.


Proof


Suppose we have two points $\mathbf{x}_1$ and $\mathbf{x}_2$. After rotation and translation, the two points become $\mathbf{x}_1^{\prime}$ and $\mathbf{x}_2^{\prime}$, respectively.

\[\begin{align} \lvert \mathbf{x}_1^{\prime} - \mathbf{x}_2^{\prime} \rvert^2 &= (\mathbf{x}_1^{\prime} - \mathbf{x}_2^{\prime})^{\top} (\mathbf{x}_1^{\prime} - \mathbf{x}_2^{\prime}) \\ &= \Big[ \mathbf{R}(\mathbf{x}_1 - \mathbf{x}_2) \Big]^{\top} \Big[ \mathbf{R}(\mathbf{x}_1 - \mathbf{x}_2) \Big] \\ &= (\mathbf{x}_1 - \mathbf{x}_2)^{\top} \mathbf{R}^{\top} \mathbf{R}(\mathbf{x}_1 - \mathbf{x}_2) \\ &= (\mathbf{x}_1 - \mathbf{x}_2)^{\top} (\mathbf{R}^{\top} \mathbf{R}) (\mathbf{x}_1 - \mathbf{x}_2) \\ &= (\mathbf{x}_1 - \mathbf{x}_2)^{\top} \mathbf{I} (\mathbf{x}_1 - \mathbf{x}_2) \\ &= (\mathbf{x}_1 - \mathbf{x}_2)^{\top} (\mathbf{x}_1 - \mathbf{x}_2) \\ &= \lvert \mathbf{x}_1 - \mathbf{x}_2 \rvert^2 \end{align}\]

This concludes the proof.
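
As a small numerical check of the claim, the following sketch applies a random rigid transformation to two random points (the specific values are arbitrary) and verifies that the distance is unchanged:

```python
import numpy as np

rng = np.random.default_rng(0)          # fixed seed so the check is reproducible

theta = rng.uniform(0.0, 2.0 * np.pi)
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
t = rng.normal(size=2)

x1, x2 = rng.normal(size=2), rng.normal(size=2)
x1_prime, x2_prime = R @ x1 + t, R @ x2 + t

# The Euclidean distance between the two points is unchanged.
assert np.isclose(np.linalg.norm(x1_prime - x2_prime),
                  np.linalg.norm(x1 - x2))
```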

Scaled Rotation

The 2D scaled rotation, also known as the similarity transformation, scales the coordinates by a constant factor $s$ in addition to the rotation.

\[\begin{align} \mathbf{x}^{\prime} &= s \mathbf{R} \mathbf{x} + \mathbf{t} \\ &= \begin{bmatrix} s \mathbf{R} & \mathbf{t} \\ \end{bmatrix} \begin{bmatrix} \mathbf{x} \\ 1 \\ \end{bmatrix} \\ &= \begin{bmatrix} s \mathbf{R} & \mathbf{t} \\ \end{bmatrix} \bar{\mathbf{x}} \\ &= \begin{bmatrix} a & -b & t_x \\ b & a & t_y \\ \end{bmatrix} \bar{\mathbf{x}} \end{align}\]

where $a = s \cos \theta$ and $b = s \sin \theta$.

This transformation preserves the angle between any two lines.


Proof


Suppose we have two distinct points $\mathbf{x}_1$ and $\mathbf{x}_2$ from line 1, and another two distinct points $\mathbf{x}_3$ and $\mathbf{x}_4$ from line 2. After the scaled rotation, the points become $\mathbf{x}_1^{\prime}$, $\mathbf{x}_2^{\prime}$, $\mathbf{x}_3^{\prime}$, and $\mathbf{x}_4^{\prime}$, respectively. The angle between the two lines could be calculated using the dot product. Note that $\lvert \mathbf{x}_i^{\prime} - \mathbf{x}_j^{\prime} \rvert = s \lvert \mathbf{x}_i - \mathbf{x}_j \rvert$, because the rotation preserves lengths and the scaling multiplies them by $s$.

\[\begin{align} \cos \theta^{\prime} &= \frac{(\mathbf{x}_1^{\prime} - \mathbf{x}_2^{\prime}) \cdot (\mathbf{x}_3^{\prime} - \mathbf{x}_4^{\prime})}{\lvert \mathbf{x}_1^{\prime} - \mathbf{x}_2^{\prime} \rvert \lvert\mathbf{x}_3^{\prime} - \mathbf{x}_4^{\prime} \rvert} \\ &= \frac{(\mathbf{x}_1^{\prime} - \mathbf{x}_2^{\prime})^{\top} (\mathbf{x}_3^{\prime} - \mathbf{x}_4^{\prime})}{\lvert \mathbf{x}_1^{\prime} - \mathbf{x}_2^{\prime} \rvert \lvert\mathbf{x}_3^{\prime} - \mathbf{x}_4^{\prime} \rvert} \\ &= \frac{\Big[ s\mathbf{R}(\mathbf{x}_1 - \mathbf{x}_2) \Big]^{\top} \Big[ s\mathbf{R}(\mathbf{x}_3 - \mathbf{x}_4) \Big]}{s \lvert \mathbf{x}_1 - \mathbf{x}_2 \rvert s \lvert\mathbf{x}_3 - \mathbf{x}_4 \rvert} \\ &= \frac{\Big[ \mathbf{R}(\mathbf{x}_1 - \mathbf{x}_2) \Big]^{\top} \Big[ \mathbf{R}(\mathbf{x}_3 - \mathbf{x}_4) \Big]}{ \lvert \mathbf{x}_1 - \mathbf{x}_2 \rvert \lvert\mathbf{x}_3 - \mathbf{x}_4 \rvert} \\ &= \frac{(\mathbf{x}_1 - \mathbf{x}_2) \cdot (\mathbf{x}_3 - \mathbf{x}_4)}{\lvert \mathbf{x}_1 - \mathbf{x}_2 \rvert \lvert\mathbf{x}_3 - \mathbf{x}_4 \rvert} \\ &= \cos \theta \end{align}\]

This concludes the proof.
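
A minimal NumPy sketch of the same check (the helper `angle_between`, the scale $s = 2.5$, and the random points are all just illustrative choices):

```python
import numpy as np

def angle_between(u, v):
    """Angle between two direction vectors, computed from the dot product."""
    return np.arccos(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

rng = np.random.default_rng(1)
theta, s = rng.uniform(0.0, 2.0 * np.pi), 2.5   # arbitrary rotation angle and scale
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
t = rng.normal(size=2)

def similarity(x):
    return s * R @ x + t

x1, x2, x3, x4 = (rng.normal(size=2) for _ in range(4))

angle_before = angle_between(x1 - x2, x3 - x4)
angle_after = angle_between(similarity(x1) - similarity(x2),
                            similarity(x3) - similarity(x4))
assert np.isclose(angle_before, angle_after)
```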

Affine

The 2D affine transformation uses an arbitrary $2 \times 3$ transformation matrix $\mathbf{A}$.

\[\begin{align} \mathbf{x}^{\prime} &= \mathbf{A} \bar{\mathbf{x}} \\ &= \begin{bmatrix} a_{00} & a_{01} & a_{02} \\ a_{10} & a_{11} & a_{12} \\ \end{bmatrix} \bar{\mathbf{x}} \\ \end{align}\]

This transformation preserves the parallelism between two parallel lines.


Proof


Suppose we have two distinct points $\mathbf{x}_1$ and $\mathbf{x}_2$ from line 1, and another two distinct points $\mathbf{x}_3$ and $\mathbf{x}_4$ from line 2, where line 1 is parallel to line 2. After the affine transformation, the points become $\mathbf{x}_1^{\prime}$, $\mathbf{x}_2^{\prime}$, $\mathbf{x}_3^{\prime}$, and $\mathbf{x}_4^{\prime}$, respectively.


Because line 1 is parallel to line 2, we must have

\[\mathbf{x}_1 - \mathbf{x}_2 = k(\mathbf{x}_3 - \mathbf{x}_4)\]

where $k$ is a non-zero value.

\[\begin{align} \mathbf{x}_1^{\prime} - \mathbf{x}_2^{\prime} &= \mathbf{A} \bar{\mathbf{x}}_1 - \mathbf{A} \bar{\mathbf{x}}_2 \\ &= \mathbf{A} (\bar{\mathbf{x}}_1 - \bar{\mathbf{x}}_2) \\ &= \mathbf{A} \begin{bmatrix} \mathbf{x}_1 - \mathbf{x}_2 \\ 0 \\ \end{bmatrix} \\ &= \mathbf{A} \begin{bmatrix} k(\mathbf{x}_3 - \mathbf{x}_4) \\ 0 \\ \end{bmatrix} \\ &= k \mathbf{A} \begin{bmatrix} \mathbf{x}_3 - \mathbf{x}_4 \\ 0 \\ \end{bmatrix} \\ &= k\mathbf{A} (\bar{\mathbf{x}}_3 - \bar{\mathbf{x}}_4) \\ &= k (\mathbf{A}\bar{\mathbf{x}}_3 - \mathbf{A}\bar{\mathbf{x}}_4) \\ &= k (\mathbf{x}_3^{\prime} - \mathbf{x}_4^{\prime}) \\ \end{align}\]

This concludes the proof.
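
A small numerical sketch of the parallelism claim (the affine matrix and points are random and purely illustrative; two points on each of two parallel lines are constructed from a shared direction vector):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.normal(size=(2, 3))             # an arbitrary 2 x 3 affine matrix

def affine(x):
    """Apply A to the augmented coordinate [x; 1]."""
    return A @ np.append(x, 1.0)

d = rng.normal(size=2)                  # shared direction of the two parallel lines
x1, x3 = rng.normal(size=2), rng.normal(size=2)
x2, x4 = x1 + 1.7 * d, x3 + 0.4 * d     # second point on each line

d1 = affine(x1) - affine(x2)
d2 = affine(x3) - affine(x4)

# The transformed difference vectors are still parallel:
# their 2D cross product (z-component) vanishes.
assert np.isclose(d1[0] * d2[1] - d1[1] * d2[0], 0.0)
```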

Projective

The 2D projective transformation, also known as the perspective transformation or homography, operates on homogeneous coordinates, rather than augmented coordinates.

\[\begin{align} \tilde{\mathbf{x}}^{\prime} &= \tilde{\mathbf{H}} \tilde{\mathbf{x}} \\ &= \begin{bmatrix} \mathbf{h}_{0}^{\top} \\ \mathbf{h}_{1}^{\top} \\ \mathbf{h}_{2}^{\top} \\ \end{bmatrix} \tilde{\mathbf{x}} \\ &= \begin{bmatrix} \mathbf{h}_{0}^{\top} \tilde{\mathbf{x}} \\ \mathbf{h}_{1}^{\top} \tilde{\mathbf{x}} \\ \mathbf{h}_{2}^{\top} \tilde{\mathbf{x}} \\ \end{bmatrix} \\ &= \begin{bmatrix} h_{00} & h_{01} & h_{02} \\ h_{10} & h_{11} & h_{12} \\ h_{20} & h_{21} & h_{22} \\ \end{bmatrix} \begin{bmatrix} \omega x \\ \omega y \\ \omega \\ \end{bmatrix} \\ &= \begin{bmatrix} \omega h_{00} x + \omega h_{01} y + \omega h_{02} \\ \omega h_{10} x + \omega h_{11} y + \omega h_{12} \\ \omega h_{20} x + \omega h_{21} y + \omega h_{22} \\ \end{bmatrix} \\ &= \begin{bmatrix} \omega^{\prime} x^{\prime} \\ \omega^{\prime} y^{\prime} \\ \omega^{\prime} \\ \end{bmatrix} \\ \end{align}\]

where $\tilde{\mathbf{H}}$ is an arbitrary $3 \times 3$ matrix. Notice that $\tilde{\mathbf{H}}$ is homogeneous and $\tilde{\mathbf{H}}_1 \equiv \tilde{\mathbf{H}}_2$ if and only if $\tilde{\mathbf{H}}_1 = k\tilde{\mathbf{H}}_2$ where $k$ is a non-zero value.


The resulting inhomogeneous coordinates are obtained by normalizing the homogeneous coordinates, i.e., dividing by the last component.

\[\begin{align} x^{\prime} &= \frac{\mathbf{h}_{0}^{\top} \tilde{\mathbf{x}}}{\mathbf{h}_{2}^{\top} \tilde{\mathbf{x}}} \\ &= \frac{\omega \mathbf{h}_{0}^{\top} \bar{\mathbf{x}}}{\omega \mathbf{h}_{2}^{\top} \bar{\mathbf{x}}} \\ &= \frac{\mathbf{h}_{0}^{\top} \bar{\mathbf{x}}}{\mathbf{h}_{2}^{\top} \bar{\mathbf{x}}} \\ &= \frac{h_{00} x + h_{01} y + h_{02}}{h_{20} x + h_{21} y + h_{22}} \\ \end{align}\] \[\begin{align} y^{\prime} &= \frac{\mathbf{h}_{1}^{\top} \tilde{\mathbf{x}}}{\mathbf{h}_{2}^{\top} \tilde{\mathbf{x}}} \\ &= \frac{\omega \mathbf{h}_{1}^{\top} \bar{\mathbf{x}}}{\omega \mathbf{h}_{2}^{\top} \bar{\mathbf{x}}} \\ &= \frac{\mathbf{h}_{1}^{\top} \bar{\mathbf{x}}}{\mathbf{h}_{2}^{\top} \bar{\mathbf{x}}} \\ &= \frac{h_{10} x + h_{11} y + h_{12}}{h_{20} x + h_{21} y + h_{22}} \\ \end{align}\]

Because

\[\begin{align} \tilde{\mathbf{x}}^{\prime} &= \tilde{\mathbf{H}} \tilde{\mathbf{x}} \\ &= \begin{bmatrix} \mathbf{h}_{0}^{\top} \\ \mathbf{h}_{1}^{\top} \\ \mathbf{h}_{2}^{\top} \\ \end{bmatrix} \tilde{\mathbf{x}} \\ &= \begin{bmatrix} \mathbf{h}_{0}^{\top} \tilde{\mathbf{x}} \\ \mathbf{h}_{1}^{\top} \tilde{\mathbf{x}} \\ \mathbf{h}_{2}^{\top} \tilde{\mathbf{x}} \\ \end{bmatrix} \\ &= \mathbf{h}_{2}^{\top} \tilde{\mathbf{x}} \begin{bmatrix} \frac{\mathbf{h}_{0}^{\top} \tilde{\mathbf{x}}}{\mathbf{h}_{2}^{\top} \tilde{\mathbf{x}}} \\ \frac{\mathbf{h}_{1}^{\top} \tilde{\mathbf{x}}}{\mathbf{h}_{2}^{\top} \tilde{\mathbf{x}}} \\ 1 \\ \end{bmatrix} \\ &= \mathbf{h}_{2}^{\top} \tilde{\mathbf{x}} \begin{bmatrix} x^{\prime} \\ y^{\prime} \\ 1 \\ \end{bmatrix} \\ &= \mathbf{h}_{2}^{\top} \tilde{\mathbf{x}} \bar{\mathbf{x}}^{\prime} \end{align}\]

We have

\[\begin{align} \bar{\mathbf{x}}^{\prime} &= \frac{\tilde{\mathbf{H}} \tilde{\mathbf{x}}}{\mathbf{h}_{2}^{\top} \tilde{\mathbf{x}}} \\ &= \frac{\tilde{\mathbf{H}} \bar{\mathbf{x}}}{\mathbf{h}_{2}^{\top} \bar{\mathbf{x}}} \end{align}\]
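
A minimal sketch of applying a projective transformation and normalizing the result (the matrix entries are made up, and `apply_homography` is just an illustrative helper name); it also checks that rescaling $\tilde{\mathbf{H}}$ leaves the mapping unchanged:

```python
import numpy as np

def apply_homography(H, x):
    """Apply a 3 x 3 projective transform to a 2D point and normalize the result."""
    x_tilde = np.append(x, 1.0)                  # lift to homogeneous coordinates
    x_tilde_prime = H @ x_tilde
    return x_tilde_prime[:2] / x_tilde_prime[2]  # divide by the last component

# A made-up homography, purely for illustration.
H = np.array([[1.0,   0.2,   3.0],
              [0.1,   0.9,  -1.0],
              [0.001, 0.002, 1.0]])

x = np.array([10.0, 20.0])
print(apply_homography(H, x))

# H is homogeneous: any non-zero rescaling of H gives the same mapping.
assert np.allclose(apply_homography(H, x), apply_homography(5.0 * H, x))
```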

This transformation preserves straight lines, i.e., collinear points remain collinear after the transformation.


Proof


Suppose we have three distinct points $\mathbf{x}_1$, $\mathbf{x}_2$ and $\mathbf{x}_3$ from one line. After projective transformation, the points become $\mathbf{x}_1^{\prime}$, $\mathbf{x}_2^{\prime}$, and $\mathbf{x}_3^{\prime}$, respectively.


Because $\mathbf{x}_1$, $\mathbf{x}_2$ and $\mathbf{x}_3$ are on the same line, we must have

\[\mathbf{x}_1 - \mathbf{x}_2 = k(\mathbf{x}_1 - \mathbf{x}_3)\]

and

\[\bar{\mathbf{x}}_1 - \bar{\mathbf{x}}_2 = k(\bar{\mathbf{x}}_1 - \bar{\mathbf{x}}_3)\]

where $k$ is a non-zero value.


Then,

\[\bar{\mathbf{x}}_2 = (1 - k) \bar{\mathbf{x}}_1 + k \bar{\mathbf{x}}_3\] \[\begin{align} \bar{\mathbf{x}}_1^{\prime} - \bar{\mathbf{x}}_2^{\prime} &= \frac{\tilde{\mathbf{H}} \bar{\mathbf{x}}_1}{\mathbf{h}_{2}^{\top} \bar{\mathbf{x}}_1} - \frac{\tilde{\mathbf{H}} \bar{\mathbf{x}}_2}{\mathbf{h}_{2}^{\top} \bar{\mathbf{x}}_2} \\ &= \frac{(\mathbf{h}_{2}^{\top} \bar{\mathbf{x}}_2) \tilde{\mathbf{H}} \bar{\mathbf{x}}_1 - (\mathbf{h}_{2}^{\top} \bar{\mathbf{x}}_1) \tilde{\mathbf{H}} \bar{\mathbf{x}}_2}{ (\mathbf{h}_{2}^{\top} \bar{\mathbf{x}}_1) (\mathbf{h}_{2}^{\top} \bar{\mathbf{x}}_2)} \\ &= \frac{(\mathbf{h}_{2}^{\top} \bar{\mathbf{x}}_2) \tilde{\mathbf{H}} \bar{\mathbf{x}}_1 - (\mathbf{h}_{2}^{\top} \bar{\mathbf{x}}_1) \tilde{\mathbf{H}} \big[ (1 - k) \bar{\mathbf{x}}_1 + k \bar{\mathbf{x}}_3 \big]}{ (\mathbf{h}_{2}^{\top} \bar{\mathbf{x}}_1) (\mathbf{h}_{2}^{\top} \bar{\mathbf{x}}_2)} \\ &= \frac{\big[(\mathbf{h}_{2}^{\top} \bar{\mathbf{x}}_2) - (1 - k) (\mathbf{h}_{2}^{\top} \bar{\mathbf{x}}_1) \big] \tilde{\mathbf{H}} \bar{\mathbf{x}}_1 - k (\mathbf{h}_{2}^{\top} \bar{\mathbf{x}}_1) \tilde{\mathbf{H}} \bar{\mathbf{x}}_3 }{ (\mathbf{h}_{2}^{\top} \bar{\mathbf{x}}_1) (\mathbf{h}_{2}^{\top} \bar{\mathbf{x}}_2)} \\ &= \frac{\big[\mathbf{h}_{2}^{\top} \big((1 - k) \bar{\mathbf{x}}_1 + k \bar{\mathbf{x}}_3 \big) - (1 - k) (\mathbf{h}_{2}^{\top} \bar{\mathbf{x}}_1) \big] \tilde{\mathbf{H}} \bar{\mathbf{x}}_1 - k (\mathbf{h}_{2}^{\top} \bar{\mathbf{x}}_1) \tilde{\mathbf{H}} \bar{\mathbf{x}}_3 }{ (\mathbf{h}_{2}^{\top} \bar{\mathbf{x}}_1) (\mathbf{h}_{2}^{\top} \bar{\mathbf{x}}_2)} \\ &= \frac{\big[(1 - k) (\mathbf{h}_{2}^{\top} \bar{\mathbf{x}}_1) + k (\mathbf{h}_{2}^{\top} \bar{\mathbf{x}}_3) - (1 - k) (\mathbf{h}_{2}^{\top} \bar{\mathbf{x}}_1) \big] \tilde{\mathbf{H}} \bar{\mathbf{x}}_1 - k (\mathbf{h}_{2}^{\top} \bar{\mathbf{x}}_1) \tilde{\mathbf{H}} \bar{\mathbf{x}}_3 }{ (\mathbf{h}_{2}^{\top} \bar{\mathbf{x}}_1) (\mathbf{h}_{2}^{\top} \bar{\mathbf{x}}_2)} \\ &= \frac{k (\mathbf{h}_{2}^{\top} \bar{\mathbf{x}}_3) \tilde{\mathbf{H}} \bar{\mathbf{x}}_1 - k (\mathbf{h}_{2}^{\top} \bar{\mathbf{x}}_1) \tilde{\mathbf{H}} \bar{\mathbf{x}}_3 }{ (\mathbf{h}_{2}^{\top} \bar{\mathbf{x}}_1) (\mathbf{h}_{2}^{\top} \bar{\mathbf{x}}_2)} \\ &= \frac{k (\mathbf{h}_{2}^{\top} \bar{\mathbf{x}}_3)}{\mathbf{h}_{2}^{\top} \bar{\mathbf{x}}_2} \bigg( \frac{\tilde{\mathbf{H}} \bar{\mathbf{x}}_1 }{ \mathbf{h}_{2}^{\top} \bar{\mathbf{x}}_1} - \frac{\tilde{\mathbf{H}} \bar{\mathbf{x}}_3}{\mathbf{h}_{2}^{\top} \bar{\mathbf{x}}_3} \bigg) \\ &= \frac{k (\mathbf{h}_{2}^{\top} \bar{\mathbf{x}}_3)}{\mathbf{h}_{2}^{\top} \bar{\mathbf{x}}_2} (\bar{\mathbf{x}}_1^{\prime} - \bar{\mathbf{x}}_3^{\prime}) \end{align}\]

Because $k \neq 0$, $\mathbf{h}_{2}^{\top} \bar{\mathbf{x}}_2 \neq 0$, and $\mathbf{h}_{2}^{\top} \bar{\mathbf{x}}_3 \neq 0$, we must have

\[\bar{\mathbf{x}}_1^{\prime} - \bar{\mathbf{x}}_2^{\prime} = k^{\prime} (\bar{\mathbf{x}}_1^{\prime} - \bar{\mathbf{x}}_3^{\prime})\]

where $k^{\prime}$ is a non-zero value and

\[k^{\prime} = \frac{k (\mathbf{h}_{2}^{\top} \bar{\mathbf{x}}_3)}{\mathbf{h}_{2}^{\top} \bar{\mathbf{x}}_2}\]

This concludes the proof.
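
A small numerical sketch of the collinearity claim (the homography entries and the three collinear points are made up for illustration):

```python
import numpy as np

# A made-up, well-conditioned homography for illustration.
H = np.array([[ 1.0,  0.3,  2.0],
              [-0.2,  1.1, -1.0],
              [ 0.01, 0.02, 1.0]])

def project(x):
    y = H @ np.append(x, 1.0)
    return y[:2] / y[2]

x1 = np.array([1.0, 2.0])
d = np.array([3.0, -1.0])
x2, x3 = x1 + 0.5 * d, x1 + 2.0 * d     # three collinear points

# After the homography the images remain collinear:
# the 2D cross product of the two difference vectors is (numerically) zero.
u = project(x1) - project(x2)
v = project(x1) - project(x3)
assert np.isclose(u[0] * v[1] - u[1] * v[0], 0.0)
```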

Chained Plane Transformations

The 2D plane transformation using a $2 \times 3$ transformation matrix

\[\mathbf{x}^{\prime} = \begin{bmatrix} \mathbf{S}^{2\times2} & \mathbf{t}^{2\times1} \\ \end{bmatrix} \bar{\mathbf{x}}\]

could also be expressed with a $3 \times 3$ transformation matrix.

\[\begin{align} \bar{\mathbf{x}}^{\prime} &= \begin{bmatrix} \mathbf{x}^{\prime} \\ 1 \\ \end{bmatrix} \\ &= \begin{bmatrix} \mathbf{S} \mathbf{x} + \mathbf{t} \\ 1 \\ \end{bmatrix} \\ &= \begin{bmatrix} \begin{bmatrix} \mathbf{S} & \mathbf{t} \\ \end{bmatrix} \begin{bmatrix} \mathbf{x} \\ 1 \\ \end{bmatrix} \\ 1 \\ \end{bmatrix} \\ &= \begin{bmatrix} \begin{bmatrix} \mathbf{S} & \mathbf{t} \\ \end{bmatrix} \begin{bmatrix} \mathbf{x} \\ 1 \\ \end{bmatrix} \\ \begin{bmatrix} \mathbf{0} & 1 \\ \end{bmatrix} \begin{bmatrix} \mathbf{x} \\ 1 \\ \end{bmatrix} \\ \end{bmatrix} \\ &= \begin{bmatrix} \mathbf{S} & \mathbf{t} \\ \mathbf{0} & 1 \\ \end{bmatrix} \begin{bmatrix} \mathbf{x} \\ 1 \\ \end{bmatrix} \\ &= \begin{bmatrix} \mathbf{S} & \mathbf{t} \\ \mathbf{0} & 1 \\ \end{bmatrix} \bar{\mathbf{x}} \\ \end{align}\]

where

\[\begin{bmatrix} \mathbf{S} & \mathbf{t} \\ \mathbf{0} & 1 \\ \end{bmatrix}\]

is the $3 \times 3$ transformation matrix, and it can easily be chained with other $3 \times 3$ transformation matrices using matrix multiplication.


For example,

\[\begin{align} \bar{\mathbf{x}}^{\prime} &= \begin{bmatrix} \mathbf{S}_n & \mathbf{t}_n \\ \mathbf{0} & 1 \\ \end{bmatrix} \cdots \begin{bmatrix} \mathbf{S}_2 & \mathbf{t}_2 \\ \mathbf{0} & 1 \\ \end{bmatrix} \begin{bmatrix} \mathbf{S}_1 & \mathbf{t}_1 \\ \mathbf{0} & 1 \\ \end{bmatrix} \bar{\mathbf{x}} \\ \end{align}\]
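
A minimal NumPy sketch of chaining (the helper `make_3x3`, the 45-degree angle, and the translation are just illustrative choices); note that the rightmost matrix in the product is applied first:

```python
import numpy as np

def make_3x3(S, t):
    """Embed a 2 x 2 block S and a translation t into a 3 x 3 transformation matrix."""
    M = np.eye(3)
    M[:2, :2] = S
    M[:2, 2] = t
    return M

theta = np.deg2rad(45.0)                        # arbitrary illustrative angle
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

T1 = make_3x3(R, np.zeros(2))                   # rotation about the origin
T2 = make_3x3(np.eye(2), np.array([5.0, 0.0]))  # translation
chained = T2 @ T1                               # T1 is applied first, then T2

x_bar = np.array([1.0, 0.0, 1.0])               # augmented coordinate
print(chained @ x_bar)                          # rotate by 45 degrees, then shift by (5, 0)
```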


Also notice that the $3 \times 3$ transformation matrix can be applied not only to the augmented coordinates $\bar{\mathbf{x}}$, but also to the homogeneous coordinates $\tilde{\mathbf{x}}$.

\[\begin{align} \tilde{\mathbf{x}}^{\prime} &= \tilde{w} \bar{\mathbf{x}}^{\prime} \\ &= \tilde{w} \begin{bmatrix} \mathbf{S} & \mathbf{t} \\ \mathbf{0} & 1 \\ \end{bmatrix} \bar{\mathbf{x}} \\ &= \begin{bmatrix} \mathbf{S} & \mathbf{t} \\ \mathbf{0} & 1 \\ \end{bmatrix} (\tilde{w} \bar{\mathbf{x}}) \\ &= \begin{bmatrix} \mathbf{S} & \mathbf{t} \\ \mathbf{0} & 1 \\ \end{bmatrix} \tilde{\mathbf{x}} \\ \end{align}\]
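
A tiny sketch of this last point (the matrix, point, and scale factor are made up for illustration): applying the matrix to the homogeneous coordinate simply scales the result by the same factor, so it represents the same transformed point.

```python
import numpy as np

M = np.array([[2.0, 0.0,  1.0],     # an arbitrary 3 x 3 matrix with last row [0, 0, 1]
              [0.0, 2.0, -1.0],
              [0.0, 0.0,  1.0]])

x_bar = np.array([3.0, 4.0, 1.0])   # augmented coordinate
w = 2.5                             # arbitrary homogeneous scale factor
x_tilde = w * x_bar                 # homogeneous coordinate

# Applying M to the homogeneous coordinate scales the result by the same w.
assert np.allclose(M @ x_tilde, w * (M @ x_bar))
```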