# Lecture Notes

Below are the lecture notes from my course on multivariable calculus, as well as some handouts I give to students for some of the experiments in the introductory physics lab courses. Note that I update my notes each term, but this website may not link to the latest editions.

### Notes on MTH3015: Calculus III

Visualizing 3D Curves with WolframAlpha

Week 3: Vector-valued Functions; Curves

Week 4: Functions of Several Variables; Limits and Continuity

Week 5: Partial Derivatives; The Chain Rule

Week 7: Directional Derivative and Gradient

Week 8: Taylor Series and Extrema

**Week 8: Taylor Series and Extrema**

Sujeet Akula

Taylor's Theorem

The Taylor series expansion is a powerful tool for performing numerical approximations, and we discuss it in some detail here. Consider a function of two variables, $f(x,y)$, and define $F(t) = f(a+th,b+tk)$, the function we used to define the directional derivative $D_{\vect{u}}f(a,b) = F'(0)$, for $\vect{u} = (h,k)$ and $||\vect{u}|| = 1 \Leftrightarrow h^2 + k^2 = 1$. Since $F(t)$ is a scalar function of a single variable, we already know how to expand it from Calculus I and II. The Taylor-Maclaurin expansion of $g(x)$ is:
\begin{equation}
g(x) = \sum_{j=0}^\infty \frac{g^{(j)}(0)}{j!}x^j \mathrm{.}
\end{equation}
(Remember that the Taylor-Maclaurin expansion is just the Taylor expansion around $x=0$.)

If $F(t)$ is $n$ times differentiable on the interval $I=[0,1]$, then we can write the first $n$ terms of the Taylor-Maclaurin expansion and the error for $F(1)$:
\begin{equation}
F(1) = F(0) + F'(0) + \frac{F''(0)}{2!} + \dots + \frac{F^{(n-1)}(0)}{(n-1)!} + \frac{F^{(n)}(s)}{n!} \mathrm{,}
\end{equation}
where $s\in (0,1)$. (The last term is called the error term.) We already determined that
\begin{equation}
F'(t) = hf_x(a+th,b+tk) + kf_y(a+th,b+tk) \mathrm{,}
\end{equation}
so clearly,
\begin{equation}
F'(0) = hf_x(a,b) + kf_y(a,b) \mathrm{.}
\end{equation}
We now compute the second derivative:
\begin{equation}
F''(t) = h^2f_{xx}(a+th,b+tk) + k^2f_{yy}(a+th,b+tk) + 2hkf_{xy}(a+th,b+tk) \mathrm{,}
\end{equation}
giving,
\begin{equation}
F''(0) = h^2f_{xx}(a,b) + k^2f_{yy}(a,b) + 2hkf_{xy}(a,b) \mathrm{.}
\end{equation}
The higher derivatives are found the same way by applying the chain rule again. We can now write Taylor's Theorem for a linear approximation of $f(x,y)$.
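As a quick numerical sanity check (my own illustration, not part of the notes), the chain-rule formula $F'(0) = hf_x(a,b) + kf_y(a,b)$ can be verified with a central difference for a sample function such as $f(x,y) = \sin x \cos y$:

```python
import math

# Sample function chosen for illustration (an assumption, not from the notes)
def f(x, y):
    return math.sin(x) * math.cos(y)

a, b = 0.5, 1.2      # expansion point (a, b)
h, k = 0.6, 0.8      # direction u = (h, k) with h^2 + k^2 = 1

def F(t):
    """F(t) = f(a + t*h, b + t*k), a scalar function of one variable."""
    return f(a + t * h, b + t * k)

# Central-difference estimate of F'(0)
eps = 1e-6
Fprime0 = (F(eps) - F(-eps)) / (2 * eps)

# Chain-rule formula: F'(0) = h*f_x(a,b) + k*f_y(a,b)
fx = math.cos(a) * math.cos(b)    # f_x = cos(x)cos(y)
fy = -math.sin(a) * math.sin(b)   # f_y = -sin(x)sin(y)
chain_rule = h * fx + k * fy

# The two agree to roughly the accuracy of the finite difference
assert abs(Fprime0 - chain_rule) < 1e-7
```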

**Taylor's Theorem** Let $f: \D \to \R$, $\D \subseteq \Rtwo$. Let $B((a,b),\delta)$
be an open disk centered at $(a,b)$ of radius $\delta$, and $B((a,b),\delta) \subseteq \D$. Let
$f$ be twice continuously differentiable in $B((a,b),\delta)$. For every
$(a+h,b+k)\in B((a,b),\delta)$, $\exists s \in (0,1)$ such that
\begin{equation}
f(a+h,b+k) = f(a,b) + hf_x(a,b) + kf_y(a,b) + \frac{1}{2}\left(h^2f_{xx}(u,v) + k^2f_{yy}(u,v) + 2hkf_{xy}(u,v)\right),
\end{equation}
where $u = a + sh$, and $v = b + sk$.
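To see the theorem at work numerically, one can check that the error of the linear approximation shrinks like $h^2 + k^2$. The sketch below (my example function, not one from the notes) shrinks the step tenfold twice and watches the error drop roughly a hundredfold each time:

```python
import math

# Example function for illustration (an assumption, not from the notes)
def f(x, y):
    return math.exp(x) * math.sin(y)

a, b = 0.3, 0.7
fx = math.exp(a) * math.sin(b)   # f_x(a, b)
fy = math.exp(a) * math.cos(b)   # f_y(a, b)

# Error of the linear approximation f(a,b) + h*f_x + k*f_y for shrinking (h, k)
errors = []
for scale in (1e-1, 1e-2, 1e-3):
    h = k = scale
    linear = f(a, b) + h * fx + k * fy
    errors.append(abs(f(a + h, b + k) - linear))

# Quadratic remainder: a tenfold smaller step gives a ~100x smaller error
ratios = [errors[i] / errors[i + 1] for i in range(2)]
assert all(90 < r < 110 for r in ratios)
```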
We now present a theorem useful for proving the convergence of Taylor expansions.

**Theorem** Let $f: \D \to \R$, $\D \subseteq \Rtwo$. Suppose that there is a constant $M$,
such that
\begin{equation}
\left|h^2f_{xx}(x,y) + 2hkf_{xy}(x,y) + k^2f_{yy}(x,y)\right| \le M ,
\end{equation}
for all $(h,k)$ that satisfy $h^2+k^2<\delta^2$ and all $(x,y)\in B((a,b),\delta)$. Then, for every
$(x,y)\in B((a,b),\delta)$,
\begin{equation}
\left|f(x,y)-f(a,b)-f_x(a,b)(x-a)-f_y(a,b)(y-b)\right| \le \frac{1}{2}M .
\end{equation}

Extrema

In this section, we first present the definition of local minima and maxima of a function of several variables, restricting ourselves to two variables for the sake of simplicity, though everything extends to three or more variables. Next, we discuss how to determine whether a candidate critical point is actually an extremum or a saddle point.

Maxima and Minima

We define local maxima and minima.

**Definition** A function $f(x,y)$ of two variables has a local maximum at
$(a,b)$ if there is a $\delta>0$ such that $f(x,y) \le f(a,b)$ for every $(x,y)$ in the
neighborhood disk $B((a,b),\delta)$, and $f$ is defined on $B((a,b),\delta)$. The number
$f(a,b)$ is called a local maximum value. If $f(x,y) \ge f(a,b)$ for $(x,y)$ in the
disk, then $f$ has a local minimum at $(a,b)$, and $f(a,b)$ is a local minimum value.
Now that we have defined local minima and maxima, we must come up with a method of finding them. To
this end, we provide a theorem that serves as a necessary condition for a point to be a local
minimum or maximum.

**Theorem** Let $f:\D\to\R$ be continuously differentiable on a subset $\D$ of $\Rtwo$.
If $f$ has a local minimum or maximum at $(a,b)$ then
\begin{equation}
f_x(a,b) = f_y(a,b) = 0 . \label{critpt}
\end{equation}
The solutions $(a,b)$ to Eq. \eqref{critpt} are called critical points of $f$.
I reiterate here that the above theorem is only a necessary and not sufficient condition for
local minima and maxima. This is to say that while every local minimum or maximum satisfies
the theorem, not every point that does is a local minimum or maximum.

**Example** Find all critical points of $f(x,y)$ given by the formula
\[ f(x,y) = \frac{1}{2}x^2 - 4xy + 9y^2 . \]
The critical points of $f$ are the solutions to the system of equations:
\[ f_x = x-4y=0 \text{, and } f_y = -4x+18y = 0 . \]
The only solution to this system is $(0,0)$. This is the only critical point of $f$.
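Since both partial derivatives here are linear, the critical-point equations form a $2\times 2$ homogeneous linear system, and a nonzero determinant already guarantees that $(0,0)$ is the only solution; a minimal check (my illustration):

```python
# Coefficient matrix of the system  f_x = x - 4y = 0,  f_y = -4x + 18y = 0
A = [[1.0, -4.0],
     [-4.0, 18.0]]

# A 2x2 homogeneous linear system has only the trivial solution (0, 0)
# exactly when its determinant is nonzero.
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
assert det == 2.0  # nonzero, so (0, 0) is the unique critical point
```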

Second Derivative Test

Suppose that we have the following formula for $f$:
\begin{equation}
f(x,y) = Ax^2 + 2Bxy + Cy^2 \mathrm{,}
\end{equation}
where $A \ne 0$. Then we see that $(0,0)$ is a critical point of $f$. If we rewrite $f$ in the form
\begin{equation}
f(x,y) = A\left(x + \frac{B}{A}y\right)^2 + \frac{y^2}{A}\left(AC - B^2\right) \mathrm{,}
\end{equation}
we can determine the nature of the critical point. If $A>0$ and $(AC-B^2) > 0$, then $f(x,y) > 0$ for every $(x,y) \ne (0,0)$, so $(0,0)$ is clearly a local minimum. If instead we have $A<0$ but still $(AC-B^2) > 0$, then $f(x,y) < 0$ for every $(x,y) \ne (0,0)$, which means that $(0,0)$ is a local maximum. One can approximate an arbitrary (but twice continuously differentiable) $f(a+h,b+k)$ to second order via a Taylor expansion and use the result above to prove the so-called second derivative test theorem.

**Theorem (Second Derivative Test)** Let $f:\D\to\R$ be twice continuously differentiable,
and let $(a,b)$ be a critical point of $f$. Let $A = f_{xx}(a,b)$, $B=f_{xy}(a,b)$, and
$C=f_{yy}(a,b)$.

- If $A>0$ and $(AC-B^2)>0$ then $f$ has a local minimum at $(a,b)$.
- If $A<0$ and $(AC-B^2)>0$ then $f$ has a local maximum at $(a,b)$.
- If $(AC-B^2)<0$ then $f$ does not have a local maximum or a local minimum at $(a,b)$, and $(a,b)$ is called a saddle point of $f$.
- If $AC-B^2 = 0$, then the test is inconclusive (this is an exceptional case).
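The four cases translate directly into a small classifier. The function below is my own sketch (not part of the notes), taking $A=f_{xx}$, $B=f_{xy}$, and $C=f_{yy}$ evaluated at a critical point:

```python
def classify_critical_point(A, B, C):
    """Second derivative test: A = f_xx, B = f_xy, C = f_yy at a critical point."""
    D = A * C - B * B   # the discriminant AC - B^2
    if D > 0:
        return "local minimum" if A > 0 else "local maximum"
    if D < 0:
        return "saddle point"
    return "inconclusive"

# The example from these notes: f(x,y) = x^2/2 - 4xy + 9y^2 at (0,0)
# has A = 1, B = -4, C = 18, so AC - B^2 = 2 > 0 and A > 0.
print(classify_critical_point(1, -4, 18))  # local minimum
```

Note that when $AC-B^2 > 0$, $A$ cannot be zero (since $AC > B^2 \ge 0$), so the first branch is safe.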

**Example** Given that $(0,0)$ is a critical point of $f(x,y)$ from the example in the
previous section and given by the formula
\[ f(x,y) = \frac{1}{2}x^2 - 4xy + 9y^2 , \]
determine whether $(0,0)$ is a local minimum, local maximum, or a saddle point.
We begin by computing $A = f_{xx}(0,0) = 1$, $B = f_{xy}(0,0) = -4$, and $C=f_{yy}(0,0)=18$,
as in the second derivative test. We see that $AC-B^2 = 2 > 0$. Next, we see that $A=1>0$, thus
$(0,0)$ is a local minimum.

Absolute Extrema

Absolute extrema are relatively simple to understand once local extrema are understood. We present the definition here.

**Definition** Let $f:\D\to\R$ be a function defined on a subset $\D$ of $\Rtwo$. We
say that $f$ attains an absolute minimum on $\D$ at $(a,b)$ if $f(x,y)\ge f(a,b)$ for
all $(x,y)$ in $\D$. If $f(x,y)\le f(a,b)$ for all $(x,y)$ in $\D$ then we say that $f$
attains an absolute maximum on $\D$ at $(a,b)$. The number $f(a,b)$ would be called the
absolute minimum value or the absolute maximum value.

Week 9: Double Integrals; Polar Integrals

Week 10: Triple Integrals; Cylindrical and Spherical Co-ordinates

Week 11: The Transformation Theorem

### Notes on IPL/CPS Physics Labs

Experiment 12: The Simple Pendulum