
Lecture Notes

Below are the lecture notes from my course on multivariable calculus, as well as handouts I give to students for some of the experiments in the introductory physics lab courses. Note that I update my notes each term, but this website may not link to the latest editions.


Notes on MTH3015: Calculus III

Visualizing 3D Curves with WolframAlpha

Week 1: Introduction to 2D and 3D co-ordinate spaces; introduction to vectors, vector spaces, and vector operations

Week 2: Parametrization

Week 3: Vector-valued Functions; Curves

Week 4: Functions of Several Variables; Limits and Continuity


Week 5: Partial Derivatives; The Chain Rule

$\newcommand{\vect}[1]{{\mathbf{#1}} }$ $\renewcommand{\vec}[1]{{\overrightarrow{#1}} }$ $\newcommand{\norm}[1]{{\left|\left|#1\right|\right|} }$ $\newcommand{\D}{{{\mathcal D}} }$ $\newcommand{\C}{{{\mathcal C}} }$ $\newcommand{\R}{{\mathbb{R}} }$ $\newcommand{\Rtwo}{{\mathbb{R}^2} }$ $\newcommand{\Rthree}{{\mathbb{R}^3} }$ $\newcommand{\V}{{\mathbb{V}} }$ $\newcommand{\nhat}{{\vect{\hat{n}}} }$ $\newcommand{\vi}{{\vect{i}} }$ $\newcommand{\vj}{{\vect{j}} }$ $\newcommand{\vk}{{\vect{k}} }$ $\newcommand{\f}{{\vect{f}} }$ $\newcommand{\g}{{\vect{g}} }$ $\newcommand{\vz}{{\vect{0}} }$ $\DeclareMathOperator{\sgn}{sgn}$ $\DeclareMathOperator{\dist}{dist}$

Week 5: Partial Derivatives and the Chain Rule

Sujeet Akula

Partial Derivatives

We will discuss 'partial derivatives' of functions of several variables. Consider $f(x,y)$, a function of two variables. There are two possible first-order derivatives: the derivative with respect to $x$ and the derivative with respect to $y$. When the rate of change with respect to one variable is considered while the other variables are held constant, the derivative is called a partial derivative. Let's begin with an example formula for $f$: \begin{equation} f(x,y) = 3x^2y - 4y^2 ,\end{equation} and let's consider the point $(1,2)$. The two derivatives with respect to $x$ and $y$ at this point are denoted as \begin{equation} f_x(1,2) , f_y(1,2) , \end{equation} respectively. They are also denoted as \begin{equation} \frac{\partial f}{\partial x}(1,2) , \frac{\partial f}{\partial y}(1,2) . \end{equation} Let's see how these are computed, beginning with $f_x(1,2)$. We are considering the rate of change of $f$ as $x$ varies around $x=1$ while $y$ is fixed at $y=2$, so we begin by considering the function of a single variable \begin{equation} g(x) = f(x,2) = 6x^2 -16 . \end{equation} Now we compute the derivative as usual: \begin{equation} g'(x) = 12x . \end{equation} Since $g'(1) = 12$, we have that \begin{equation} f_x(1,2) = 12 . \end{equation} This means that the partial derivative of $f$ with respect to $x$ at the point $(1,2)$ is 12. Geometrically, if we consider the graph of the surface $f(x,y)$, the intersection of this surface with the plane $y=2$ forms a curve, and $f_x(1,2)$ is the slope of this curve at $x=1$. Let's now compute $f_y(1,2)$ in a similar manner. We begin by defining \begin{equation} h(y) = f(1,y) = 3y-4y^2 . \end{equation} As one might expect, \begin{equation} f_y(1,2) = h'(2) = -13 . \end{equation}
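If you would like to check such computations by machine, the following short sketch uses Python with the SymPy library (an assumption on my part; any computer algebra system would do) to reproduce both values:

```python
# A minimal check of the worked example, assuming Python with SymPy installed.
import sympy as sp

x, y = sp.symbols('x y')
f = 3*x**2*y - 4*y**2

fx = sp.diff(f, x)  # partial derivative with respect to x: 6*x*y
fy = sp.diff(f, y)  # partial derivative with respect to y: 3*x**2 - 8*y

print(fx.subs({x: 1, y: 2}))  # prints 12, matching f_x(1,2)
print(fy.subs({x: 1, y: 2}))  # prints -13, matching f_y(1,2)
```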

Definition of the Partial Derivative

We will now formalize this idea with the definition of a partial derivative.

Definition Let $f:\D\to\R$ be a function defined on $\D$, where $\D\subseteq\Rtwo$. Let $u_0=(a,b)$ be a point in $\D$ such that all points 'close' to $u_0$ are also in $\D$. That is, we require that for this point, there exists a $\delta>0$ such that every point in $B(u_0,\delta)$ is also contained in $\D$. Then, the partial derivative of $f$ with respect to $x$ at $u_0$ is defined as \begin{equation} f_x(u_0) = \lim\limits_{\Delta x\to0}\frac{f(a+\Delta x,b)-f(a,b)}{\Delta x} . \end{equation} And, the partial derivative of $f$ with respect to $y$ at $u_0$ is defined as \begin{equation} f_y(u_0) = \lim\limits_{\Delta y\to0}\frac{f(a,b+\Delta y)-f(a,b)}{\Delta y} . \end{equation} The language in the definition about the points 'close to' $u_0$ ensures that $u_0$ is not a boundary point of $\D$, which is important for the definition. Note that the limits in this definition are limits of scalar functions of a single variable, since each limit involves only $\Delta x$ or only $\Delta y$. As one might expect, a partial derivative exists only if the corresponding limit exists. If the partial derivative $f_x$ or $f_y$ exists at all points in $\D$ or in some subset of $\D$, then it in turn becomes a function defined on $\D$ or on that subset of $\D$. For example, we had been discussing $f(x,y) = 3x^2y - 4y^2$. In this case, we can compute the function $f_x(x,y)$. To do this, we treat $y$ as a constant and follow the usual differentiation rules, giving \begin{equation} f_x(x,y) = 6xy . \label{fx} \end{equation} Let's break this down carefully: the first term is $3x^2y$, which can be re-written as $(3y)x^2$. Since the factor $(3y)$ is a constant with respect to $x$, we only differentiate $x^2$, which gives $2x$ by the usual polynomial rule. So, the first term becomes $(3y)(2x) = 6xy$. As for the second term, $4y^2$, the whole term is a constant with respect to $x$, so it differentiates to zero, which is how we arrive at Eq. \eqref{fx}. Similarly, we can compute $f_y(x,y)$. The logic is the same as before, and the only difference is in execution: we differentiate with respect to $y$ and treat $x$ as a constant, which gives \begin{equation} f_y(x,y) = 3x^2 - 8y . \end{equation} From the definition of the partial derivative, all of the usual rules follow: the sum rule, the product rule, and the quotient rule. Let's now consider some examples.
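The limit definition can also be illustrated numerically. The sketch below (a rough illustration in plain Python; the step sizes are arbitrary choices) evaluates the difference quotient for $f(x,y) = 3x^2y - 4y^2$ at $(1,2)$ with shrinking $\Delta x$:

```python
# Difference quotients (f(a+dx, b) - f(a, b)) / dx approaching f_x(1,2) = 12.
def f(x, y):
    return 3*x**2*y - 4*y**2

a, b = 1.0, 2.0
for dx in (1e-1, 1e-3, 1e-5):
    quotient = (f(a + dx, b) - f(a, b)) / dx
    print(dx, quotient)  # 12.6, 12.006, 12.00006: tends to 12 as dx -> 0
```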

Example Compute the first-order derivatives of \[ f(x,y) = xe^x\cos(xy^2) . \] Since this is a function of two variables, there are two first-order derivatives to compute: $f_x(x,y)$ and $f_y(x,y)$. Let's begin with $f_x(x,y)$: \begin{align} f_x(x,y) &= \frac{\partial}{\partial x}\left[xe^x\cos\left(xy^2\right)\right] \\ &= (1)(e^x\cos(xy^2)) + x\frac{\partial}{\partial x}\left[e^x\cos\left(xy^2\right)\right] \text{, by the product rule}\\ &= e^x\cos\left(xy^2\right) + x\left(\left(e^x\cos\left(xy^2\right)\right) + e^x\frac{\partial}{\partial x}\left[\cos\left(xy^2\right)\right]\right) \text{, by the product rule}\\ &= e^x\cos\left(xy^2\right) + x\left(\left(e^x\cos\left(xy^2\right)\right) + e^x\left(-\sin\left(xy^2\right)\right)\frac{\partial}{\partial x}\left[xy^2\right]\right) \text{, by the chain rule}\\ &= e^x\cos\left(xy^2\right) + x\left(\left(e^x\cos\left(xy^2\right)\right) -e^x\sin\left(xy^2\right)\left(y^2\right)\right)\\ \therefore f_x(x,y) &= \left((x+1)\cos\left(xy^2\right) - xy^2\sin\left(xy^2\right)\right)e^x \text{, after some factoring.} \end{align} Now for $f_y(x,y)$: \begin{align} f_y(x,y) &= \frac{\partial}{\partial y}\left[xe^x\cos\left(xy^2\right)\right] \\ &= \left(xe^x\right)\frac{\partial}{\partial y}\left[\cos\left(xy^2\right)\right] \text{, since $(xe^x)$ is constant with respect to $y$}\\ &= \left(xe^x\right)\left(-\sin\left(xy^2\right)\right)\frac{\partial}{\partial y}\left[xy^2\right] \text{, by the chain rule}\\ &= -\left(xe^x\right)\sin\left(xy^2\right)\left(2xy\right) \\ \therefore f_y(x,y) &= -2yx^2e^x\sin\left(xy^2\right) \text{, after some simplification.} \end{align} Extending these results to functions of three variables is straightforward. For $f(x,y,z)$, there are three possible first-order derivatives at a point: \begin{equation} f_x = \frac{\partial f}{\partial x} , f_y = \frac{\partial f}{\partial y} \text{ , and } f_z = \frac{\partial f}{\partial z} . \end{equation} Let's consider an example for this case.

Example Compute the first-order derivatives of \[ f(x,y,z) = x\ln\left(x^2+y^2\right)+y\ln\left(y^2+z^2\right) . \] We will not work through this example in as much detail, but we will record the complete results. \begin{align} f_x(x,y,z) &= \ln\left(x^2 + y^2\right) + \frac{2x^2}{x^2 + y^2} \\ f_y(x,y,z) &= \frac{2xy}{x^2+y^2} + \ln\left(y^2 + z^2\right) + \frac{2y^2}{y^2 + z^2} \\ f_z(x,y,z) &= \frac{2yz}{y^2+z^2} \end{align}
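As before, these results can be verified with a computer algebra system; the sketch below (assuming SymPy, as in the earlier check) confirms both of the worked examples:

```python
# Symbolic checks of the two worked examples, assuming Python with SymPy.
import sympy as sp

x, y, z = sp.symbols('x y z')

# Two-variable example: f(x, y) = x * e**x * cos(x*y**2).
f2 = x*sp.exp(x)*sp.cos(x*y**2)
fx_claimed = ((x + 1)*sp.cos(x*y**2) - x*y**2*sp.sin(x*y**2))*sp.exp(x)
fy_claimed = -2*y*x**2*sp.exp(x)*sp.sin(x*y**2)
print(sp.simplify(sp.diff(f2, x) - fx_claimed))  # 0: f_x agrees
print(sp.simplify(sp.diff(f2, y) - fy_claimed))  # 0: f_y agrees

# Three-variable example: f(x, y, z) = x*ln(x**2 + y**2) + y*ln(y**2 + z**2).
f3 = x*sp.log(x**2 + y**2) + y*sp.log(y**2 + z**2)
print(sp.simplify(sp.diff(f3, x)))  # log(x**2+y**2) + 2*x**2/(x**2+y**2)
print(sp.simplify(sp.diff(f3, y)))  # 2*x*y/(x**2+y**2) + log(y**2+z**2) + 2*y**2/(y**2+z**2)
print(sp.simplify(sp.diff(f3, z)))  # 2*y*z/(y**2+z**2)
```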

Higher Order Partial Derivatives

So far we have been discussing first-order partial derivatives. To introduce partial derivatives of higher orders, we begin with the second order. Consider $f(x,y)$, for which we define \begin{equation} f_{xx} = \left(f_x\right)_x , \end{equation} which is also denoted as \begin{equation} \frac{\partial^2 f}{\partial x^2} = \frac{\partial}{\partial x}\left[\frac{\partial f}{\partial x}\right] . \end{equation} This is the partial derivative of the function $f_x(x,y)$ with respect to $x$. Similarly, we have \begin{equation} f_{yy} = \left(f_y\right)_y \iff \frac{\partial^2 f}{\partial y^2} = \frac{\partial}{\partial y}\left[\frac{\partial f}{\partial y}\right] . \end{equation} These are more or less as one would expect. The interesting new feature is the 'mixed partial derivatives,' or 'mixed partials': \begin{equation} f_{xy} = \left(f_x\right)_y \iff \frac{\partial^2 f}{\partial y\partial x} = \frac{\partial}{\partial y}\left[\frac{\partial f}{\partial x}\right] , \end{equation} and in the other order \begin{equation} f_{yx} = \left(f_y\right)_x \iff \frac{\partial^2 f}{\partial x\partial y} = \frac{\partial}{\partial x}\left[\frac{\partial f}{\partial y}\right] . \end{equation} These are the second-order partial derivatives of a function of two variables. In general, we could consider the partial derivatives of order $n$ of some function $f(x,y)$. If we choose $n=3$, we could possibly have \[ f_{xxx} , f_{xxy} , f_{xyx} , f_{xyy} , f_{yxx} , f_{yxy} , f_{yyx} , f_{yyy} .\] That is, for a function of two variables and order $n=3$, there are 8 possible partial derivatives of order $n$. Notice that here, $2^3 =8$ is the number of possible partial derivatives. This is a special case of the following statement: for a function $f:\D\to\R$, where $\D\subseteq\R^m$, there are $m^n$ possible partial derivatives of order $n$. (In this statement, $f$ is a function of $m$ variables.)
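The $m^n$ count is just the number of length-$n$ sequences drawn from $m$ variables, which a few lines of Python make concrete (a sketch using only the standard library):

```python
# Enumerate the 2**3 = 8 third-order partials of a function of two variables.
from itertools import product

variables = ('x', 'y')  # m = 2 variables
order = 3               # n = 3

partials = ['f_' + ''.join(seq) for seq in product(variables, repeat=order)]
print(partials)       # ['f_xxx', 'f_xxy', 'f_xyx', 'f_xyy', 'f_yxx', 'f_yxy', 'f_yyx', 'f_yyy']
print(len(partials))  # 8, i.e. m**n with m = 2 and n = 3
```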

Definition Let $f:\D\to\R$ be a function defined on a subset $\D$ of $\R^m$. We say that $f$ is $n$ times continuously differentiable on $\D$ if all partial derivatives of order up to and including $n$ exist at every point in $\D$ and are continuous functions on $\D$. As a technical point, we have to require that $\D$ in the above definition be an open subset of $\R^m$, because otherwise the partial derivatives would not be well-defined at all points in $\D$. We will not define what it means for a function $f$ to be differentiable at a single given point. These sorts of statements form the basis of the advanced mathematical subject of 'Analysis,' which is also formally known as 'The Theory of Functions of a Real Variable.' The above definition will be the only definition of differentiability that we will consider in this course.

Example Compute all derivatives of the first and second order of the function \[ f(x,y) = x\sin\left(x+y\right) . \] First, we should recognize that this is a function of two variables. This means that there are $2^1=2$ partial derivatives of the first order, and $2^2=4$ partial derivatives of the second order. We begin with the first order: \begin{align} f_x(x,y) &= \sin(x+y) + x\cos(x+y) , \\ f_y(x,y) &= x\cos(x+y) . \end{align} And for the second order, we have: \begin{align} f_{xx}(x,y) &= \cos(x+y) + \cos(x+y) - x\sin(x+y) , \\ f_{yy}(x,y) &= -x\sin(x+y) , \\ f_{xy}(x,y) &= \cos(x+y) - x\sin(x+y) , \\ f_{yx}(x,y) &= \cos(x+y) - x\sin(x+y) . \end{align} We can see that all of these partial derivatives are defined and continuous on $\Rtwo$. Therefore, we have directly shown that $f$ is at least twice continuously differentiable. In fact, by observing that every derivative of $f$ will be a combination of polynomials and trigonometric functions, one can show that $f$ is $n$ times continuously differentiable for every $n$. Also, we see that the two mixed partials, $f_{xy}$ and $f_{yx}$, are equal. This is a special case of the very famous and important theorem to follow.

Theorem (Clairaut's Theorem) Let $f:\D\to\R$ be defined and twice continuously differentiable on a subset $\D$ of $\Rtwo$. Then, $f_{xy} = f_{yx}$ on $\D$. I am tempted to give a proof of this very important theorem, but instead I leave it as an exercise for the most interested students, with the following sketch: begin by computing the first-order partial derivatives according to the definition of partial derivatives, and then compute the two mixed partials by applying the mean value theorem several times (exactly four times). As one might expect, this theorem generalizes to higher-order derivatives and to functions of more variables.
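For the skeptical, here is a sketch (again assuming SymPy) that recomputes the second-order partials of the example $f(x,y) = x\sin(x+y)$ and confirms that the mixed partials agree, as Clairaut's theorem guarantees:

```python
# Second-order partials of x*sin(x+y), assuming Python with SymPy installed.
import sympy as sp

x, y = sp.symbols('x y')
f = x*sp.sin(x + y)

print(sp.diff(f, x, x))  # 2*cos(x + y) - x*sin(x + y), i.e. f_xx
print(sp.diff(f, y, y))  # -x*sin(x + y), i.e. f_yy
# Clairaut's theorem: the two mixed partials are equal.
print(sp.simplify(sp.diff(f, x, y) - sp.diff(f, y, x)))  # 0
```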

Laplace's Equation and Harmonic Functions

Because it is convenient to do so at this point, we introduce Laplace's equation, which gives the simple definition of a 'harmonic function.' Laplace's equation is a type of partial differential equation. For a function $u(x,y)$ of two variables, it is written as \begin{equation} u_{xx} + u_{yy} = 0 . \end{equation} For a function $u(x,y,z)$ of three variables, it is written as \begin{equation} u_{xx} + u_{yy} + u_{zz} = 0 . \end{equation} Any function that satisfies Laplace's equation is called a harmonic function. These functions are tremendously important in Physics, as many physical systems are described by an equation that either is, or may be reduced to, Laplace's equation.

Example Show that the function $u(x,y) = e^x\sin y$ is a harmonic function. We begin by computing the first-order partial derivatives \[ u_x = e^x\sin y , u_y = e^x\cos y . \] Then, we compute the second-order partial derivatives \[ u_{xx} = e^x\sin y , u_{yy} = -e^x\sin y .\] We can see that these satisfy \[ u_{xx} + u_{yy} = 0 , \] so $u(x,y)$ is a harmonic function.
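This check, too, can be automated; a sketch assuming SymPy:

```python
# Verify that u(x, y) = e**x * sin(y) satisfies Laplace's equation.
import sympy as sp

x, y = sp.symbols('x y')
u = sp.exp(x)*sp.sin(y)

laplacian = sp.diff(u, x, x) + sp.diff(u, y, y)
print(sp.simplify(laplacian))  # 0, so u is harmonic
```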

The Chain Rule

We now state the chain rule for functions of two variables in the following theorem.

Theorem (The Chain Rule) Let $f:\D\to\R$ be a function which is defined and continuously differentiable on a subset $\D$ of $\Rtwo$. Let $g,h:I\to\R$ be functions of one variable that are differentiable on an open interval $I$, such that the points given by $(g(t),h(t))$ lie in $\D$ for all $t$ in $I$. Then, the composite function $F(t)=f(g(t),h(t))$ is differentiable and its derivative is \begin{equation} F'(t) = f_x(g(t),h(t))g'(t) + f_y(g(t),h(t))h'(t) . \end{equation} This is also written more conveniently as \begin{equation} \frac{d}{dt}f(x(t),y(t)) = \frac{\partial f}{\partial x}\frac{dx}{dt} + \frac{\partial f}{\partial y}\frac{dy}{dt} , \end{equation} which is a form more analogous to the chain rule for a function of one variable. The main difference is that when considering functions of several variables, we have to sum over terms that each look like a one-variable chain-rule term.

Example Compute $\dfrac{d}{dt}f(\cos t,\sin t)$ for the following formula for $f$: \[ f(x,y) = 2xy + x^3 -y . \] Here, we are setting $x(t) = \cos t$ and $y(t) = \sin t$. We could of course substitute these into $f$ and compute the derivative of the one-variable function $F(t) = f(\cos t,\sin t)$, but instead let's use the chain rule, since it is the topic at hand; it reads: \[ \frac{d}{dt}f(\cos t,\sin t) = f_x(\cos t, \sin t)\frac{dx}{dt} + f_y(\cos t, \sin t)\frac{dy}{dt} . \] First, we compute $f_x$ and $f_y$: $f_x(x,y) = 2y+3x^2$ and $f_y(x,y) = 2x-1$. This of course means that \[ f_x(\cos t, \sin t) = 2\sin t + 3\cos^2t \text{, and } f_y(\cos t,\sin t) = 2\cos t -1 . \] Now all we need are $\dfrac{dx}{dt}$ and $\dfrac{dy}{dt}$, which are immediate: \begin{gather*} \frac{dx}{dt} = \frac{d}{dt}\cos t = -\sin t , \\ \frac{dy}{dt} = \frac{d}{dt}\sin t = \cos t . \end{gather*} Finally, we have the result (which you can check against manually computing $F'(t)$): \[ \frac{d}{dt}f(\cos t,\sin t) = -(2\sin t + 3 \cos^2t)\sin t + (2\cos t -1)\cos t .\] The chain rule can of course be extended to functions of more variables. For example, $f(x,y,z)$ is a function of three variables, and in this case the chain rule reads \begin{equation} \frac{d}{dt}f(x(t),y(t),z(t)) = \frac{\partial f}{\partial x}\frac{dx}{dt} + \frac{\partial f}{\partial y}\frac{dy}{dt} + \frac{\partial f}{\partial z}\frac{dz}{dt} . \end{equation} Another generalization: instead of $x$ and $y$ being single-variable functions like $x(t)$, we can consider $f(x(s,t),y(s,t))$, so that $f(x,y)$, $x(s,t)$, and $y(s,t)$ are each functions of two variables. Then, the chain rule gives us \begin{align} \frac{\partial}{\partial s}f(x(s,t),y(s,t)) &= \frac{\partial f}{\partial x}\frac{\partial x}{\partial s} + \frac{\partial f}{\partial y}\frac{\partial y}{\partial s} , \\ \frac{\partial}{\partial t}f(x(s,t),y(s,t)) &= \frac{\partial f}{\partial x}\frac{\partial x}{\partial t} + \frac{\partial f}{\partial y}\frac{\partial y}{\partial t} . \end{align} It is left to the student to continue to generalize the chain rule to the case of $f(x(r,s,t),y(t),z(r,s))$, but the method is the same throughout.
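You can check the worked example above against a direct computation of $F'(t)$; the following sketch (assuming SymPy) carries out both routes and confirms that they agree:

```python
# Chain rule vs. direct differentiation for f(x, y) = 2*x*y + x**3 - y
# along x(t) = cos(t), y(t) = sin(t). Assumes Python with SymPy installed.
import sympy as sp

t, x, y = sp.symbols('t x y')
f = 2*x*y + x**3 - y

# Direct route: substitute first, then differentiate the one-variable F(t).
direct = sp.diff(f.subs({x: sp.cos(t), y: sp.sin(t)}), t)

# Chain-rule route: f_x * dx/dt + f_y * dy/dt, evaluated along the curve.
chain = (sp.diff(f, x)*sp.diff(sp.cos(t), t)
         + sp.diff(f, y)*sp.diff(sp.sin(t), t)).subs({x: sp.cos(t), y: sp.sin(t)})

print(sp.simplify(direct - chain))  # 0: the two computations agree
```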

Example Find the partial derivatives of \[ f(s^2 + t^2, s^3 - t^3) \] in terms of $f_x$ and $f_y$ (because you are not given a formula for $f$). The chain rule gives: \[ \frac{\partial}{\partial s}f(s^2 + t^2, s^3 - t^3) = f_x(s^2+t^2,s^3-t^3)(2s) + f_y(s^2+t^2,s^3-t^3)(3s^2) , \] and \[ \frac{\partial}{\partial t}f(s^2 + t^2, s^3 - t^3) = f_x(s^2+t^2,s^3-t^3)(2t) + f_y(s^2+t^2,s^3-t^3)(-3t^2) , \] because \[ x_s = 2s , y_s = 3s^2 , x_t = 2t \text{, and } y_t = -3t^2 . \]
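Incidentally, a computer algebra system can apply the chain rule even when $f$ is left unspecified; in SymPy (a sketch, with `sp.Function` creating an undefined symbolic function) this looks like:

```python
# The chain rule applied to an unspecified f(x, y), assuming SymPy.
import sympy as sp

s, t = sp.symbols('s t')
f = sp.Function('f')

expr = f(s**2 + t**2, s**3 - t**3)
# SymPy returns the same structure as above, written in its Derivative
# notation: 2*s times f_x plus 3*s**2 times f_y, each evaluated at
# the point (s**2 + t**2, s**3 - t**3).
print(sp.diff(expr, s))
print(sp.diff(expr, t))  # 2*t times f_x minus 3*t**2 times f_y
```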




Week 7: Directional Derivative and Gradient

Week 8: Taylor Series and Extrema

Week 9: Double Integrals; Polar Integrals

Week 10: Triple Integrals; Cylindrical and Spherical Co-ordinates

Week 11: The Transformation Theorem


Notes on IPL/CPS Physics Labs

Lab Report Grading Structure

Experiment 9: Maxwell's Wheel

Experiment 12: The Simple Pendulum

Experiment 13: Simple Harmonic Motion

Experiment 14: Standing Waves

Experiment 16: Electric Field and Electric Potential