Second partial derivative test

In mathematics, the second partial derivative test is a method in multivariable calculus used to determine whether a critical point (a, b) of a function f(x, y) is a local minimum, a local maximum, or a saddle point.

Suppose that the second partial derivatives of f are continuous near the critical point (a, b), and define

: M = f_{xx}(a,b) f_{yy}(a,b) - \left( f_{xy}(a,b) \right)^2,

or, in other words, the determinant of the 2×2 Hessian matrix of f at (a, b),

: M = \begin{vmatrix} f_{xx}(a,b) & f_{xy}(a,b) \\ f_{yx}(a,b) & f_{yy}(a,b) \end{vmatrix}.

If "M" > 0 and f_{xx}(a,b) > 0 then "f"("a", "b") is a local minimum.

If "M" > 0 and f_{xx} (a,b) < 0 then "f"("a", "b") is a local maximum.

If "M" < 0 then "f"("a", "b") has a saddle point.

If "M" = 0 then the second derivatives test is indecisive.

M and f_{xx}(a,b) are the leading principal minors of the Hessian, and the sign conditions listed above are exactly the conditions for the Hessian to be positive definite, negative definite, or indefinite.

We only test points at which f_x = 0 and f_y = 0. The reason is that at a local extremum the trace of the surface in the xz-plane must have a horizontal tangent, and the same is true of its trace in the yz-plane, so both first partial derivatives must vanish there.
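
To illustrate how the test can be carried out in practice, the following is a minimal symbolic sketch, assuming the SymPy library and using the sample function f(x, y) = x^3 - 3x + y^2, chosen here purely for illustration:

    import sympy as sp

    x, y = sp.symbols('x y')
    f = x**3 - 3*x + y**2            # sample function chosen for illustration

    # critical points: solve f_x = 0 and f_y = 0 simultaneously
    fx, fy = sp.diff(f, x), sp.diff(f, y)
    critical_points = sp.solve([fx, fy], [x, y], dict=True)

    H = sp.hessian(f, (x, y))        # 2x2 matrix of second partial derivatives
    M = H.det()                      # f_xx * f_yy - f_xy**2

    for p in critical_points:
        m, fxx = M.subs(p), H[0, 0].subs(p)
        if m > 0:
            label = 'local minimum' if fxx > 0 else 'local maximum'
        elif m < 0:
            label = 'saddle point'
        else:
            label = 'test is inconclusive'
        print((p[x], p[y]), label)
    # expected: (-1, 0) is a saddle point and (1, 0) is a local minimum

For this sample function the Hessian is diagonal with entries 6x and 2, so M = 12x and its sign at each critical point decides the label exactly as in the conditions above.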

Geometric Interpretation

Assume in what follows that all derivatives are evaluated at (a, b).

If "M" < 0 then f_{xx}f_{yy} < f_{xy}^2. If either f_{xx} or f_{yy} are negative, the other must be positive and thus the concavities of the x cross section and the y cross section are in opposite direction. This is clearly a saddle point.

If "M" > 0 then f_{xx}f_{yy} > f_{xy}^2, which implies that f_{xx} and f_{yy} are the same sign and sufficiently large. For this case the concavities of the x and y cross sections are either both up if positive, or both down if negative. This is clearly a local minimum or a local maximum, respectively.

This leaves the remaining case: f_{xx} and f_{yy} have the same sign, but f_{xy}^2 > f_{xx} f_{yy}, so that M < 0. Geometrically, a large f_{xy} means that the slope of the graph in one direction changes rapidly as we move in the orthogonal direction, and this overcomes the concavity of the cross sections. For example, suppose all second derivatives are positive and (a, b) = (0, 0). If M > 0, then in whatever direction we move from the origin in the xy-plane, the value of the function increases, so the origin is a local minimum. If instead M < 0 (with f_{xy} sufficiently large), then moving from the origin in a direction between the axes into the second quadrant, say, the positive concavity of the cross sections would lead us to expect the value of the function to increase; but the slope in the x direction grows so quickly that, as we move to the left (in the negative x direction) into the second quadrant, the value of the function actually decreases. Since the origin is a stationary point by hypothesis, it is therefore a saddle point.
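
For instance, take f(x, y) = x^2 + 4xy + y^2 at the origin: f_{xx} = f_{yy} = 2 > 0, so both axis cross sections are concave up, but f_{xy} = 4, giving M = 2 \cdot 2 - 4^2 = -12 < 0. Along the line y = -x the function reduces to f(x, -x) = -2x^2, which decreases away from the origin, so the origin is indeed a saddle point.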

Examples

Find and label the critical points of the following function:

: z = (x+y)(xy + xy^2)

To solve this problem, we first find the partial derivatives of the function with respect to x and y.

: \frac{\partial z}{\partial x} = y(2x + y)(y + 1)

: \frac{\partial z}{\partial y} = x \left( 3y^2 + 2y(x+1) + x \right)
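
These derivatives can be verified symbolically; a minimal check, assuming the SymPy library:

    import sympy as sp

    x, y = sp.symbols('x y')
    z = (x + y)*(x*y + x*y**2)

    # the factored forms quoted above
    zx_claimed = y*(2*x + y)*(y + 1)
    zy_claimed = x*(3*y**2 + 2*y*(x + 1) + x)

    # both differences simplify to zero, confirming the derivatives
    print(sp.simplify(sp.diff(z, x) - zx_claimed))   # 0
    print(sp.simplify(sp.diff(z, y) - zy_claimed))   # 0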

Looking at

: \frac{\partial z}{\partial x} = 0,

we see that y must equal 0, −1, or −2x.

Substituting the first possibility, y = 0, into the second equation gives

: \frac{\partial z}{\partial y} = x \left( 3y^2 + 2y(x+1) + x \right) = x^2 = 0,

so x must equal 0, which yields the critical point (0, 0).

For the second possibility, y = −1, we have

: \frac{\partial z}{\partial y} = x \left( 3 - 2(x+1) + x \right) = x(1 - x) = 0,

so x must equal 0 or 1, which yields the critical points (0, −1) and (1, −1).

For the third possibility, y = −2x, we have

: \frac{\partial z}{\partial y} = x \left( 3(-2x)^2 + 2(-2x)(x+1) + x \right) = x^2(8x - 3) = 0,

so x must equal 0 or \frac{3}{8}; the first gives (0, 0) again, and the second gives the critical point \left( \frac{3}{8}, -\frac{3}{4} \right).

Let us now list all the critical points.

: (x,y) \in \left\{ (0,0),\ (0,-1),\ (1,-1),\ \left( \frac{3}{8}, -\frac{3}{4} \right) \right\}
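
This list can be confirmed symbolically; a minimal check, assuming the SymPy library:

    import sympy as sp

    x, y = sp.symbols('x y')
    z = (x + y)*(x*y + x*y**2)

    # solve the system dz/dx = 0, dz/dy = 0
    critical_points = sp.solve([sp.diff(z, x), sp.diff(z, y)], [x, y], dict=True)
    print(critical_points)
    # the four points (0, 0), (0, -1), (1, -1) and (3/8, -3/4), in some order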

Now we have to label the critical points using the second partial derivative test. Here

: D = f_{xx}(a,b) f_{yy}(a,b) - \left( f_{xy}(a,b) \right)^2 = 2y(y+1) \cdot 2x(3y + x + 1) - \left( 3y^2 + y(4x+2) + 2x \right)^2 .

Now we plug in each of the critical points we found in order to label them.

At (0, 0) we have D = 0; at (0, −1), D = −1; at (1, −1), D = −1; and at (3/8, −3/4), D = 27/128 with f_{xx} = -3/8.

So we can now label some of the points: at (0, −1) and (1, −1), f(x, y) has saddle points, and at (3/8, −3/4) it has a local maximum, since D > 0 and f_{xx} < 0 there. At the remaining point, (0, 0), the test is inconclusive, and a higher-order test is needed to determine what the function is doing there.
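
The labelling above can be reproduced numerically; a minimal check, assuming the SymPy library:

    import sympy as sp

    x, y = sp.symbols('x y')
    z = (x + y)*(x*y + x*y**2)

    H = sp.hessian(z, (x, y))    # matrix of second partial derivatives
    D = H.det()                  # f_xx * f_yy - f_xy**2

    points = [(0, 0), (0, -1), (1, -1), (sp.Rational(3, 8), -sp.Rational(3, 4))]
    for a, b in points:
        vals = {x: a, y: b}
        print((a, b), D.subs(vals), H[0, 0].subs(vals))
    # (0, 0):      D = 0                    -> inconclusive
    # (0, -1):     D = -1                   -> saddle point
    # (1, -1):     D = -1                   -> saddle point
    # (3/8, -3/4): D = 27/128, f_xx = -3/8  -> local maximum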

