# Lecture Notes

Below are the lecture notes from my course on multivariable calculus, as well as some handouts I give to students for some of the experiments in the introductory physics lab courses. Note that I update my notes each term, but this website may not link to the latest editions.

### Notes on MTH3015: Calculus III

Visualizing 3D Curves with WolframAlpha

Week 3: Vector-valued Functions; Curves

Week 4: Functions of Several Variables; Limits and Continuity

Week 5: Partial Derivatives; The Chain Rule

Week 7: Directional Derivative and Gradient

Week 8: Taylor Series and Extrema

Week 9: Double Integrals; Polar Integrals

Week 10: Triple Integrals; Cylindrical and Spherical Co-ordinates

Week 11: The Transformation Theorem

**Week 11: The Transformation Theorem**

Sujeet Akula

**Determinants**

Matrices and determinants are typically covered at the introductory level in pre-calculus courses, so here I present a brief review of what is necessary to understand the Transformation Theorem, presented in the next section. A matrix is an array with $n$ rows and $m$ columns, for a total of $n\cdot m$ elements. We will only discuss matrices that have real numbers as their elements. A matrix with the same number of rows as columns, i.e., an $n\times n$ matrix, is called a 'square matrix.' Square matrices admit an operation called the determinant, which is necessary for the Transformation Theorem.

Suppose that we have a square $2\times2$ matrix, $A$:
\begin{equation}
A = \begin{pmatrix} a & b \\ c & d \end{pmatrix} ,
\end{equation}
where $a$, $b$, $c$, and $d$ are real numbers. Then the determinant of $A$ is given by
\begin{equation}
\det A = \det \begin{pmatrix} a & b \\ c & d \end{pmatrix} = ad - bc .
\end{equation}
This is a good working definition of the determinant. In the case of a $3\times3$ matrix, $B$, the determinant is given by
\begin{equation}
\det B = \det \begin{pmatrix} a & b & c \\ d & e & f \\ g & h & j \end{pmatrix}
= a \det \begin{pmatrix} e & f \\ h & j \end{pmatrix}
- b \det \begin{pmatrix} d & f \\ g & j \end{pmatrix}
+ c \det \begin{pmatrix} d & e \\ g & h \end{pmatrix} .
\end{equation}
So, the determinant of a $3\times3$ matrix reduces to determinants of $2\times2$ matrices, which we have already defined. It is crucial to notice that the second term carries a minus sign. This arises from an important property of the determinant called anti-symmetry, which you will study in greater detail in a course on linear algebra.
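The $2\times2$ formula and the first-row cofactor expansion above are easy to check numerically. The following sketch (function names `det2` and `det3` are my own, not from the notes) implements them directly:

```python
# Determinant by cofactor expansion along the first row,
# matching the 2x2 and 3x3 formulas in the text.
def det2(a, b, c, d):
    # Determinant of the 2x2 matrix [[a, b], [c, d]].
    return a * d - b * c

def det3(m):
    # m is a 3x3 nested list; expand along the first row,
    # alternating signs (the anti-symmetry property).
    a, b, c = m[0]
    d, e, f = m[1]
    g, h, j = m[2]
    return (a * det2(e, f, h, j)
            - b * det2(d, f, g, j)
            + c * det2(d, e, g, h))

print(det2(1, 2, 3, 4))  # 1*4 - 2*3 = -2
print(det3([[1, 2, 3],
            [4, 5, 6],
            [7, 8, 10]]))  # -3
```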

**The Transformation Theorem**

We now study one of the most important theorems for evaluating multiple integrals. Recall that in the 1D case, when evaluating an integral of the sort
\begin{equation}
\int_a^b f(x)\, dx ,
\end{equation}
we often find it convenient to introduce a new variable $u$ by making the substitution $x=g(u)$, so that $dx=g'(u)\, du$, giving
\begin{equation}
\int_a^b f(x)\, dx = \int_{g^{-1}(a)}^{g^{-1}(b)} f(g(u))\, g'(u)\, du .
\end{equation}
Now that we are considering integrals of functions of two and three variables, it is necessary to extend this concept. We begin by introducing substitutions of the form
\begin{equation}
x=F(u,v) , \quad y=G(u,v) \text{ , in $\Rtwo$,}
\end{equation}
and
\begin{equation}
x=F(u,v,w) , \quad y=G(u,v,w) , \quad z=H(u,v,w) \text{ , in $\Rthree$.}
\end{equation}
It is necessary not only for these functions to be continuously differentiable, but also for their inverses to be continuously differentiable. Transformations composed of functions satisfying these properties are called 'diffeomorphisms.' The transformation may be labeled $\T$, and it is composed of these functions; thus, one may write $\T=(F,G,H)$.
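The 1D substitution rule can be sanity-checked symbolically. Here is a small sketch using sympy (the choices $f(x)=x^2$ and $g(u)=\sin u$ are mine, purely for illustration):

```python
import sympy as sp

x, u = sp.symbols('x u')

# f(x) = x**2 on [0, 1]; substitute x = g(u) = sin(u),
# so dx = cos(u) du and the limits become g^{-1}(0) = 0, g^{-1}(1) = pi/2.
f = x**2
g = sp.sin(u)

lhs = sp.integrate(f, (x, 0, 1))
rhs = sp.integrate(f.subs(x, g) * sp.diff(g, u), (u, 0, sp.pi / 2))

print(lhs, rhs)  # both equal 1/3
```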

**Example** As an example, consider a new co-ordinate system that is just a rotation in
the plane by an angle $\alpha$. This is called an $SO(2,\R)$ transformation, where "$SO$" stands
for "special orthogonal." The nomenclature of these transformations is far beyond the scope of this
course, but one encounters these in a more abstract setting in the study of Group Theory.
Regardless, the $SO(2,\R)$ transformation may be represented by
\begin{equation}
x=F(u,v)=u\cos\alpha+v\sin\alpha , y=G(u,v)=-u\sin\alpha+v\cos\alpha .
\end{equation}
As alluded to earlier, the transformations to polar, cylindrical, and spherical co-ordinates are
further examples.
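The name of the rotation example can itself be verified symbolically. A quick sketch with sympy (my own check, not part of the notes) confirms the two properties the name suggests: the coefficient matrix is orthogonal, so the transformation preserves $u^2+v^2$, and it is "special," meaning its determinant is $1$:

```python
import sympy as sp

u, v, alpha = sp.symbols('u v alpha', real=True)

# Coefficient matrix of the SO(2, R) rotation above:
# x = u*cos(alpha) + v*sin(alpha), y = -u*sin(alpha) + v*cos(alpha).
R = sp.Matrix([[sp.cos(alpha), sp.sin(alpha)],
               [-sp.sin(alpha), sp.cos(alpha)]])

x, y = R * sp.Matrix([u, v])

# "Orthogonal": lengths are preserved.
print(sp.simplify(x**2 + y**2))  # u**2 + v**2
# "Special": the determinant is +1.
print(sp.simplify(R.det()))      # 1
```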

**Example** The transformation to polar co-ordinates can be written as $\T=(F,G)$, where
\[ x=F(r,\theta)=r\cos\theta , y=G(r,\theta)=r\sin\theta . \]
We are nearly ready to introduce the Transformation Theorem--we just need to discuss the
'Jacobian.' The term Jacobian sometimes refers to a matrix, and sometimes to the determinant of
that matrix. Here, we will explicitly call the matrix the Jacobian matrix, and we will call its
determinant the Jacobian, denoted $J_\T$. The Jacobian is defined for a specific transformation.
If $\T_2=(F,G)$ is a 2D transformation with $F,G:\Rtwo\to\R$, then we write the Jacobian of $\T_2$
as
\begin{equation}
J_{\T_2}(u,v) = \det
\begin{pmatrix}
F_u(u,v) & F_v(u,v) \\
G_u(u,v) & G_v(u,v)
\end{pmatrix} .
\end{equation}
In 3D, we have $\T_3=(F,G,H)$, with $F,G,H:\Rthree\to\R$. The Jacobian of $\T_3$ is then
\begin{equation}
J_{\T_3}(u,v,w) = \det
\begin{pmatrix}
F_u(u,v,w) & F_v(u,v,w) & F_w(u,v,w) \\
G_u(u,v,w) & G_v(u,v,w) & G_w(u,v,w) \\
H_u(u,v,w) & H_v(u,v,w) & H_w(u,v,w)
\end{pmatrix} .
\end{equation}
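These matrices of partial derivatives can be assembled automatically with sympy's `Matrix.jacobian` method. As a sketch (co-ordinate names are my own choice), here is the Jacobian of the spherical co-ordinate transformation from Week 10, with $\phi$ the polar angle:

```python
import sympy as sp

r, theta, phi = sp.symbols('r theta phi', positive=True)

# Spherical co-ordinates: x = F, y = G, z = H as functions of (r, phi, theta).
F = r * sp.sin(phi) * sp.cos(theta)
G = r * sp.sin(phi) * sp.sin(theta)
H = r * sp.cos(phi)

# Jacobian matrix of partial derivatives, then its determinant.
Jmat = sp.Matrix([F, G, H]).jacobian([r, phi, theta])
J = sp.simplify(Jmat.det())
print(J)  # simplifies to r**2*sin(phi)
```

This recovers the familiar spherical volume element $dV = r^2\sin\phi\, dr\, d\phi\, d\theta$.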

**Theorem (Transformation Theorem)**
Let $\D$ be a closed Jordan measurable set. Let $\T$ be a diffeomorphic transformation on $\D$. If
$f:\T(\D)\to\R$ is continuous, then
\begin{equation}
\iint_{\T(\D)} f = \iint_\D (f\circ\T)\left|J_\T\right| ,
\end{equation}
where $\left|J_\T\right|$ is the absolute value of the Jacobian for the transformation $\T$.
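The theorem can also be checked numerically. As a sketch (using scipy's `dblquad`; the test function and region are my own choices, assuming scipy is available), integrate $f(x,y)=1$ over the unit disk both directly in Cartesian co-ordinates and via the polar transformation with its Jacobian factor:

```python
import math
from scipy.integrate import dblquad

# Cartesian: area of the unit disk, y between +/- sqrt(1 - x**2).
# Note dblquad integrates func(y, x), inner variable first.
cart, _ = dblquad(lambda y, x: 1.0,
                  -1.0, 1.0,
                  lambda x: -math.sqrt(1.0 - x * x),
                  lambda x: math.sqrt(1.0 - x * x))

# Polar: same integral, with the Jacobian factor |J_T| = r as integrand.
polar, _ = dblquad(lambda r, theta: r,
                   0.0, 2.0 * math.pi,
                   lambda theta: 0.0,
                   lambda theta: 1.0)

print(cart, polar)  # both approximately pi
```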

**Example**
Let us revisit the example of the transformation to polar co-ordinates, where the transformation
was given by $\T=(F,G)$ and these functions were defined by
\[ x=F(r,\theta)=r\cos\theta , y=G(r,\theta)=r\sin\theta . \]
In order to compute the Jacobian, we must compute the following partial derivatives: $F_r$,
$F_\theta$, $G_r$, and $G_\theta$. We find that
\[ F_r(r,\theta)=\cos\theta , F_\theta(r,\theta)=-r\sin\theta ,
G_r(r,\theta)=\sin\theta \text{ , and } G_\theta(r,\theta)=r\cos\theta .\]
Now, we have the elements of the Jacobian matrix, so we can compute the Jacobian itself:
\[
J_\T(r,\theta) = \det
\begin{pmatrix}
\cos\theta & -r\sin\theta \\
\sin\theta & r\cos\theta
\end{pmatrix}
= r\cos^2\theta - \left(-r\sin\theta\right)\left(\sin\theta\right)
= r\left(\cos^2\theta + \sin^2\theta\right)
= r .
\]
Notice that the transformation theorem tells us that an integral in Cartesian co-ordinates may be
rewritten in polar co-ordinates according to the rule $dx dy \to \left|J_\T\right| dr d\theta = r dr d\theta$.
This is what we had been using last week, and now we see why!
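The hand computation above can be reproduced symbolically, and the $dx\, dy \to r\, dr\, d\theta$ rule applied to a concrete integral. A sketch with sympy (the integrand $x^2+y^2$ over the unit disk is my own choice for illustration):

```python
import sympy as sp

r, theta = sp.symbols('r theta', positive=True)

# Polar transformation T = (F, G).
F = r * sp.cos(theta)
G = r * sp.sin(theta)

# Jacobian: determinant of the matrix of partial derivatives.
J = sp.simplify(sp.Matrix([F, G]).jacobian([r, theta]).det())
print(J)  # r

# Apply dx dy -> r dr dtheta: integrate x**2 + y**2 over the unit disk.
integrand = sp.simplify((F**2 + G**2) * J)  # reduces to r**3
result = sp.integrate(integrand, (r, 0, 1), (theta, 0, 2 * sp.pi))
print(result)  # pi/2
```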

### Notes on IPL/CPS Physics Labs

Experiment 12: The Simple Pendulum