Higher derivatives
In this lecture we return to functions of one variable, defined on an open or closed interval of the real axis.
Definition. A function $f\colon[a,b]\to\mathbb R$ is called twice (two times) differentiable, if it is differentiable at all points of the segment $[a,b]$, and the derivative $f'$, considered as a scalar (numeric) function on this segment, is also differentiable.
Iterating this construction, we say that a function is $n$ times differentiable, if it is differentiable and its derivative is $n-1$ times differentiable. This is an inductive definition.
Variations. Sometimes it is convenient to say that a function is 0 times differentiable, if it is continuous on the segment. If this agreement is used as the base of induction, then it defines a slightly more restricted class of $n$ times differentiable functions, usually denoted by $C^n([a,b])$.
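To feel the difference between the two scales, recall the classical function $f(t)=t^2\sin(1/t)$, $f(0)=0$: it is differentiable at every point, yet its derivative oscillates near the origin and is not continuous there, so $f$ is once differentiable without belonging to $C^1$. Below is a minimal numerical sketch of this effect (the example function and the code are illustrations added here, not part of the lecture).

```python
import math

def f(t):
    # f(t) = t^2 sin(1/t) for t != 0, extended by f(0) = 0
    return t * t * math.sin(1.0 / t) if t != 0.0 else 0.0

def f_prime(t):
    # valid only for t != 0: f'(t) = 2 t sin(1/t) - cos(1/t)
    return 2.0 * t * math.sin(1.0 / t) - math.cos(1.0 / t)

# The difference quotients at the origin tend to f'(0) = 0 ...
for h in (1e-1, 1e-3, 1e-5):
    print(f"(f({h}) - f(0)) / {h} = {f(h) / h:+.6f}")

# ... while f' keeps oscillating with amplitude about 1 as t -> 0,
# so the derivative exists everywhere but is not continuous at 0.
for t in (1e-1, 1e-3, 1e-5):
    print(f"f'({t}) = {f_prime(t):+.6f}")
```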
Exercise.
- Give a non-polynomial example of a function which is infinitely many times differentiable.
- Give an example of a function that has exactly 7 derivatives, but not 8, on the segment $[-1,1]$.
As was already established, the existence of the first derivative allows one to construct the linear approximation of a given function. If this approximation is non-degenerate (i.e., the derivative is non-vanishing), it allows one to study the extremal properties of the function, in particular, to
- guarantee a certain type of extremum at the endpoints $a$, $b$ of the segment;
- guarantee the absence of an extremum at the interior points of the interval $(a,b)$.
It turns out that higher order derivatives allow one to construct approximations of functions by polynomials of higher degree, and such an approximation sometimes guarantees the presence/type or absence of an extremum. In the cases where it does not, one has to pass to derivatives of still higher order.
Theorem. If a function $f$ is $n$ times differentiable on a segment containing the origin $t=0$, then there exists a unique polynomial $p(t)=c_0+c_1t+\cdots+c_nt^n$ of degree $\le n$ which gives an approximation of $f$ at the origin with the accuracy $o(|t|^n)$,
$$
|f(t)-p(t)|=o(|t|^n)\qquad\text{as } t\to0.
$$
The coefficients $c_0,c_1,\dots,c_n$ of this polynomial are proportional to the respective higher derivatives of $f$ at the origin,
$$
c_k=\frac{f^{(k)}(0)}{k!},\qquad k=0,1,\dots,n.
$$
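As a sanity check, one can compute the coefficients $c_k=f^{(k)}(0)/k!$ symbolically and verify the claimed accuracy. The following sketch is an illustration added here (the test function $f(t)=e^t\cos t$ and the order $n=4$ are chosen arbitrarily).

```python
import sympy as sp

t = sp.symbols('t')
f = sp.exp(t) * sp.cos(t)   # any sufficiently smooth test function will do
n = 4

# Taylor coefficients c_k = f^(k)(0) / k! and the corresponding polynomial.
p = sum(sp.diff(f, t, k).subs(t, 0) / sp.factorial(k) * t**k for k in range(n + 1))
print(sp.expand(p))                    # 1 + t - t**3/3 - t**4/6

# The remainder divided by t**n still tends to 0: the accuracy is o(|t|^n).
print(sp.limit((f - p) / t**n, t, 0))  # 0
```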
Remark. There is nothing mysterious in the formulas: if $f(t)=c_0+c_1t+\cdots+c_nt^n$ is a polynomial itself, then the higher order derivatives can be easily computed for each monomial separately:
$$
\frac{\mathrm d^k}{\mathrm dt^k}\bigg|_{t=0}t^m=\begin{cases}k!,& m=k,\\ 0,& m\ne k.\end{cases}
$$
This proves the formulas for the coefficients $c_k$ via the derivatives of the polynomial.
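A short symbolic verification of this monomial rule (purely illustrative, added here):

```python
import sympy as sp

t = sp.symbols('t')
# d^k/dt^k of t**m at t = 0 equals k! when m = k and vanishes otherwise.
for m in range(4):
    for k in range(4):
        value = sp.diff(t**m, t, k).subs(t, 0)
        assert value == (sp.factorial(k) if m == k else 0)
print("monomial rule verified for 0 <= m, k <= 3")
```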
For the proof we need the following lemma, which is very much like the intermediate value theorem.
Lemma (on finite differences). For a function $f$ differentiable on the interval $[a,b]$ the normalized finite difference $\dfrac{f(b)-f(a)}{b-a}$ coincides with the derivative $f'(c)$ at some intermediate point $c\in(a,b)$.
Proof of the Lemma. Consider the auxiliary function $g(t)=f(t)-\dfrac{f(b)-f(a)}{b-a}\,(t-a)$. Then $g(a)=g(b)$ and the same finite difference for this function is equal to zero. Since the function takes equal values at the endpoints, either its maximum or its minimum is achieved at some point $c$ inside the interval $(a,b)$ (if $g$ is constant, any interior point will do). By the Fermat rule, $g'(c)=0$. By construction, $f'(c)=\dfrac{f(b)-f(a)}{b-a}$.
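For a concrete feeling, here is a small symbolic computation (an added illustration; the function $f(t)=t^3$ on $[0,2]$ is picked arbitrarily) that finds an intermediate point $c$ at which the derivative equals the normalized finite difference.

```python
import sympy as sp

t = sp.symbols('t')
f = t**3
a, b = 0, 2

slope = (f.subs(t, b) - f.subs(t, a)) / (b - a)   # normalized finite difference, here 4
roots = sp.solve(sp.Eq(sp.diff(f, t), slope), t)  # points where f'(t) equals the slope
c = [r for r in roots if a < r < b]               # keep only the intermediate point(s)
print(slope, c)                                   # 4 [2*sqrt(3)/3]
```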
Proof of the Theorem. The formulas for the coefficients imply that the difference $h(t)=f(t)-p(t)$ is a function which is $n$ times differentiable and all its derivatives up to order $n$ vanish at the origin. We have to prove that $h(t)=o(|t|^n)$.
For $n=1$ this is obvious: by the definition of differentiability, for a function with zero derivative at the origin we have $h(t)=h(0)+o(|t|)=o(|t|)$. Reasoning by induction, consider the function $h$ and its derivative $g=h'$. By the inductive assumption, $|g(s)|\le\varepsilon|s|^{n-1}$ for an arbitrary $\varepsilon>0$ on a sufficiently small interval around the origin.
By the Lemma, $h(t)=h(t)-h(0)=t\,g(c)$ for some intermediate point $c$ between $0$ and $t$. Using the inequality $|g(c)|\le\varepsilon|c|^{n-1}\le\varepsilon|t|^{n-1}$, we conclude that $|h(t)|\le\varepsilon|t|^n$. Since $\varepsilon>0$ can be arbitrarily small, we conclude that the limit $\lim_{t\to0}|h(t)|/|t|^n$ is smaller than any positive number, i.e., must be zero.
Definition. The polynomial $p$ above is called the Taylor polynomial of order $n$ for the function $f$ at the origin $t=0$. One can easily modify this to become the definition of the Taylor polynomial at an arbitrary point $a$, writing the approximation in powers of $t-a$.
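For instance, the order-$n$ polynomial in powers of $t-a$ can be built directly from this definition; the sketch below is an added illustration (the function $f=\sin t$ and the expansion point $a=\pi/2$ are chosen arbitrarily).

```python
import sympy as sp

t, a = sp.symbols('t a')
f = sp.sin(t)
n = 3

# Taylor polynomial of order n at the point a, in powers of (t - a).
p = sum(sp.diff(f, t, k).subs(t, a) / sp.factorial(k) * (t - a)**k
        for k in range(n + 1))
print(p.subs(a, sp.pi / 2))   # the quadratic expansion 1 - (t - pi/2)**2/2
```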
Application to the investigation of functions
Any $n$ times differentiable function can be written (near the origin, but this is not a restriction!) as its Taylor polynomial of degree $n$ plus an error term which decreases fast (faster than the highest degree term of the polynomial). Hence this term cannot affect the extremal properties of the polynomial. In particular, if $n=2$ and the Taylor polynomial has a minimum (maximum) at $t=0$, so does the function itself. For quadratic polynomials the horns of the parabola go up (resp., down) if the principal coefficient is positive (resp., negative). This immediately proves the following result.
Second-order sufficient condition. If $f$ is a twice differentiable function and $a$ is a critical point, $f'(a)=0$, then the following holds:
- If $f''(a)>0$, then $a$ is a local minimum;
- If $f''(a)<0$, then $a$ is a local maximum;
- If $f''(a)=0$, everything depends on the Taylor polynomial of degree 3.
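The condition is easy to automate; the following sketch (an added illustration, with the test function $f(t)=t^4-2t^2$ chosen arbitrarily) classifies all critical points by the sign of the second derivative.

```python
import sympy as sp

t = sp.symbols('t')
f = t**4 - 2 * t**2

for c in sp.solve(sp.Eq(sp.diff(f, t), 0), t):    # critical points: f'(c) = 0
    second = sp.diff(f, t, 2).subs(t, c)
    if second > 0:
        verdict = "local minimum"
    elif second < 0:
        verdict = "local maximum"
    else:
        verdict = "inconclusive: look at higher-order terms"
    print(c, second, verdict)
# t = -1 and t = 1 give f'' = 8 > 0 (minima); t = 0 gives f'' = -4 < 0 (maximum).
```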
Problem. Find conditions for a degree 3 polynomial $p(t)=c_0+c_1t+c_2t^2+c_3t^3$ to have a local maximum/minimum on (a) the open interval $(a,b)$, (b) the semi-interval $[a,b)$. Formulate the third order necessary/sufficient conditions for an extremum at an interior point (resp., left/right endpoint) of the domain of a non-polynomial function $f$.