Total derivative
In the mathematical field of differential calculus, the term total derivative has a number of closely related meanings.
* The total derivative of a function, "f", of several variables, e.g., "t", "x", "y", etc., with respect to one of its input variables, e.g., "t", is different from the partial derivative. Calculation of the total derivative of "f" with respect to "t" does not assume that the other arguments are constant while "t" varies; instead, it allows the other arguments to depend on "t". The total derivative adds in these "indirect dependencies" to find the overall dependency of "f" on "t". For example, the total derivative of "f"("t", "x", "y") with respect to "t" is
:d"f"/d"t" = ∂"f"/∂"t" + (∂"f"/∂"x")(d"x"/d"t") + (∂"f"/∂"y")(d"y"/d"t").
:Consider multiplying both sides of the equation by the differential d"t". The result will be the differential change d"f" in the function "f". Because "f" depends on "t", some of that change will be due to the partial derivative of "f" with respect to "t". However, some of that change will also be due to the partial derivatives of "f" with respect to the variables "x" and "y". So, the differential d"t" is applied to the total derivatives of "x" and "y" to find the differentials d"x" and d"y", which can then be used to find the contribution to d"f".
* It refers to a differential operator such as
:d/d"x" = ∂/∂"x" + (d"y"/d"x") ∂/∂"y" + ...
:which computes the total derivative of a function (with respect to "x" in this case).
* It refers to the (total) differential d"f" of a function, either in the traditional language of infinitesimals or the modern language of differential forms.
* A differential of the form
:d"f" = "f"1 d"x"1 + "f"2 d"x"2 + ... + "f""n" d"x""n"
:is called a total differential or an exact differential if it is the differential of a function. Again this can be interpreted infinitesimally, or by using differential forms and the exterior derivative; a short computational check of exactness is sketched just after this list.
* It is another name for the derivative as a linear map, i.e., if "f" is a differentiable function from R"n" to R"m", then the (total) derivative (or differential) of "f" at "x" ∈ R"n" is the linear map from R"n" to R"m" whose matrix is the Jacobian matrix of "f" at "x".
* It is a synonym for the gradient, which is essentially the derivative of a function from R"n" to R.
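For the exact-differential meaning above, whether a given differential form really is the differential of some function can be checked computationally. The following is a minimal sketch in Python using SymPy (the library choice and the sample forms are assumptions of this illustration, not something fixed by the article): on a simply connected domain such as the whole plane, "P" d"x" + "Q" d"y" is exact exactly when ∂"P"/∂"y" = ∂"Q"/∂"x".

```python
import sympy as sp

x, y = sp.symbols('x y')

def is_exact(P, Q):
    """On a simply connected domain, P dx + Q dy is an exact (total)
    differential iff the mixed partials agree: dP/dy == dQ/dx."""
    return sp.simplify(sp.diff(P, y) - sp.diff(Q, x)) == 0

# d(x*y**2) = y**2 dx + 2*x*y dy is a total differential ...
print(is_exact(y**2, 2*x*y))   # True
# ... while -y dx + x dy is not the differential of any function.
print(is_exact(-y, x))         # False
```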
Suppose that "f" is a function of three variables "x", "y", and "z". Normally these variables are assumed to be independent. However, in some situations they may be dependent on each other. For example, "y" and "z" could be functions of "x". In this case the
partial derivative of "f" with respect to "x" does not give the true rate of change of "f" with respect to "x", because it does not take into account the dependency of "y" and "z" on "x". The total derivative is a way of taking such dependencies into account.For example, suppose "f" ("x", "y", "z") = "xyz". The rate of change of "f" with respect to "x" is normally determined by taking the partial derivative of "f" with respect to "x", which is, in this case, ∂"f" / ∂"x" = "yz". However, if "y" and "z" are not truly independent but depend on "x" as well this does not give the right answer. For a really simple example, suppose "y" and "z" are both equal to "x". Then "f"="xyz"="x"3 and so the (total) derivative of "f" with respect to "x" is d"f" / d"x" = 3"x"2. Notice that this is not equal to the partial derivative "yz"="x"2.
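This simple example can be verified symbolically. Below is a minimal sketch using Python with SymPy (the tooling is an assumption of this illustration, not part of the article): it contrasts the partial derivative ∂"f"/∂"x" = "yz" with the total derivative obtained once the dependencies "y" = "x" and "z" = "x" are substituted.

```python
import sympy as sp

x, y, z = sp.symbols('x y z')

f = x * y * z

# Partial derivative: y and z treated as independent of x.
partial_x = sp.diff(f, x)            # y*z

# Total derivative when y = x and z = x: substitute first, then differentiate.
f_total = f.subs({y: x, z: x})       # x**3
total_x = sp.diff(f_total, x)        # 3*x**2

print(partial_x)                     # y*z
print(total_x)                       # 3*x**2
print(partial_x.subs({y: x, z: x}))  # x**2, not equal to the total derivative
```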
While one can often perform substitutions to eliminate indirect dependencies, the chain rule provides a more efficient and general technique. Suppose "M"("t", "p"1, ..., "p""n") is a function of time "t" and of "n" variables "p""i" which themselves depend on time. Then, the total time derivative of "M" is
:d"M"/d"t" = d/d"t" "M"("t", "p"1("t"), ..., "p""n"("t")).
The chain rule for differentiating a function of several variables implies that
:d"M"/d"t" = ∂"M"/∂"t" + (∂"M"/∂"p"1)(d"p"1/d"t") + ... + (∂"M"/∂"p""n")(d"p""n"/d"t") = [∂/∂"t" + (d"p"1/d"t") ∂/∂"p"1 + ... + (d"p""n"/d"t") ∂/∂"p""n"] "M".
This expression is often used in physics for a gauge transformation of the Lagrangian, as two Lagrangians that differ only by the total time derivative of a function of time and the "n" generalized coordinates lead to the same equations of motion. The operator in brackets (in the final expression above) is also called the total derivative operator (with respect to "t").
For example, the total derivative of "f"("x"("t"), "y"("t")) is
:d"f"/d"t" = (∂"f"/∂"x")(d"x"/d"t") + (∂"f"/∂"y")(d"y"/d"t").
Here there is no ∂"f" / ∂"t" term since "f" itself does not depend on the independent variable "t" directly.
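As an illustrative check of the chain-rule formula above, here is a short sketch in Python with SymPy (the particular choice "f" = "x"²"y" is an arbitrary assumption of this example): it builds d"f"/d"t" term by term and compares it with SymPy's own total derivative.

```python
import sympy as sp

t = sp.symbols('t')
x = sp.Function('x')(t)
y = sp.Function('y')(t)

f = x**2 * y   # an example f(x, y); f has no explicit dependence on t

# Term-by-term chain rule: df/dt = (df/dx)(dx/dt) + (df/dy)(dy/dt)
chain = sp.diff(f, x) * sp.diff(x, t) + sp.diff(f, y) * sp.diff(y, t)

# SymPy's total derivative with respect to t agrees with the chain-rule form
direct = sp.diff(f, t)
print(sp.simplify(chain - direct))   # 0
```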
The total derivative via differentials
Differentials provide a simple way to understand the total derivative. For instance, suppose "M"("t", "p"1, ..., "p""n") is a function of time "t" and "n" variables "p""i", as in the previous section. Then, the differential of "M" is
:d"M" = (∂"M"/∂"t") d"t" + (∂"M"/∂"p"1) d"p"1 + ... + (∂"M"/∂"p""n") d"p""n".
This expression is often interpreted "heuristically" as a relation between infinitesimals. However, if the variables "t" and "p""i" are interpreted as functions, and "M"("t", "p"1, ..., "p""n") is interpreted to mean the composite of "M" with these functions, then the above expression makes perfect sense as an equality of differential 1-forms, and is immediate from the chain rule for the exterior derivative. The advantage of this point of view is that it takes into account arbitrary dependencies between the variables. For example, if "p"2 = "p"1², then d"p"2 = 2"p"1 d"p"1, and this relation can be substituted directly into the expression for d"M". In particular, if the variables "p""i" are all functions of "t", as in the previous section, then
:d"M" = [∂"M"/∂"t" + (∂"M"/∂"p"1)(d"p"1/d"t") + ... + (∂"M"/∂"p""n")(d"p""n"/d"t")] d"t".
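The displayed identity can be checked on a small example. The following sketch uses Python with SymPy (the particular "M" and the dependencies sin "t" and "t"³ are made-up choices for the illustration, and the symbols are named q1, q2 to keep them distinct from functions): it assembles d"M" from the partial derivatives, collapses every differential to a multiple of d"t", and compares with the derivative of the fully substituted composite.

```python
import sympy as sp

t, q1, q2 = sp.symbols('t q1 q2')
M = t * q1 + q2**2                     # an example M(t, q1, q2)

# Coefficients of dt, dq1, dq2 in the differential dM
c_t, c_1, c_2 = sp.diff(M, t), sp.diff(M, q1), sp.diff(M, q2)

# Suppose q1 and q2 are functions of t, e.g. q1 = sin(t), q2 = t**3.
q1_t, q2_t = sp.sin(t), t**3

# Then dq_i = (dq_i/dt) dt, so dM collapses to a multiple of dt:
dM_over_dt = (c_t + c_1 * sp.diff(q1_t, t) + c_2 * sp.diff(q2_t, t)).subs({q1: q1_t, q2: q2_t})

# Compare with differentiating the composite directly
composite = M.subs({q1: q1_t, q2: q2_t})
print(sp.simplify(dM_over_dt - sp.diff(composite, t)))   # 0
```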
The total derivative as a linear map
Let "U" ⊆ R"n" be an open subset. Then a function "f" : "U" → R"m" is said to be (totally) differentiable at a point "p" ∈ "U" if there exists a linear map d"f""p" : R"n" → R"m" (also denoted D"p""f" or D"f"("p")) such that
:lim("x" → "p") ‖"f"("x") − "f"("p") − d"f""p"("x" − "p")‖ / ‖"x" − "p"‖ = 0.
The linear map d"f""p" is called the (total) derivative or (total) differential of "f" at "p". A function is (totally) differentiable if its total derivative exists at every point in its domain.
Note that "f" is differentiable if and only if each of its components is differentiable. For this it is necessary, but not sufficient, that the partial derivatives of each function "f""j" exist. However, if these partial derivatives exist and are continuous, then "f" is differentiable and its differential at any point is the linear map determined by the
Jacobian matrix of partial derivatives at that point.Total differential equation
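To make the linear-map description concrete, here is a small sketch in Python with SymPy (the particular map "f"("x", "y") = ("x"²"y", sin "y") and the point (1, 2) are arbitrary choices for the example): the total derivative at a point is the linear map whose matrix is the Jacobian, and it gives the best linear approximation of "f" near that point.

```python
import sympy as sp

x, y = sp.symbols('x y')

# An example f : R^2 -> R^2
f = sp.Matrix([x**2 * y, sp.sin(y)])

# Jacobian matrix of partial derivatives; its value at a point p is the
# matrix of the total derivative df_p as a linear map R^2 -> R^2.
J = f.jacobian([x, y])

p = {x: 1, y: 2}
Jp = J.subs(p)                 # matrix of df_p at p = (1, 2)
fp = f.subs(p)

# Linear approximation f(p + h) ≈ f(p) + df_p(h) for a small displacement h
h = sp.Matrix([sp.Rational(1, 100), sp.Rational(-1, 100)])
approx = fp + Jp * h
exact = f.subs({x: 1 + h[0], y: 2 + h[1]})
print((exact - approx).evalf())   # entries far smaller than |h|, as expected to first order
```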
A "total differential equation" is a
differential equation expressed in terms of total derivatives. Since theexterior derivative is anatural operator , in a sense that can be given a technical meaning, such equations are intrinsic and "geometric".Application of the total differential to error estimation
In measurement, the total differential is used in estimating the error Δ"f" of a function "f" based on the errors Δ"x", Δ"y", ... of the parameters "x", "y", .... Assuming that
:Δ"f"("x") = "f"'("x") × Δ"x"
and that all variables are independent, then for all variables,
:Δ"f" = "f""x" Δ"x"+ "f""y" Δ"y" +...
This is because the derivative "f""x" with respect to the particular parameter "x" gives the sensitivity of the function "f" to a change in "x", in particular the error Δ"x". As they are assumed to be independent, the analysis describes the worst-case scenario. The absolute values of the component errors are used, because after simple computation, the derivative may have a negative sign. From this principle the error rules of summation, multiplication, etc. are derived, e.g.:
:Let "f"("a", "b") = "a" × "b";
:Δ"f" = "f""a" Δ"a" + "f""b" Δ"b"; evaluating the derivatives,
:Δ"f" = "b" Δ"a" + "a" Δ"b"; dividing by "f", which is "a" × "b",
:Δ"f"/"f" = Δ"a"/"a" + Δ"b"/"b"
That is to say, in multiplication, the total relative error is the sum of the relative errors of the parameters.
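A numerical illustration of the rule above, as a minimal sketch in Python (the sample values "a" = 2.0, "b" = 5.0 and the 1% errors are arbitrary assumptions of this example): it propagates absolute errors through the total differential and confirms that the relative errors of a product add.

```python
# First-order (total-differential) error propagation for f(a, b) = a * b.
def product_error(a, b, da, db):
    """Return (absolute error, relative error) estimates for f = a*b,
    using |df| <= |f_a|*da + |f_b|*db = b*da + a*db (worst case, positive inputs)."""
    abs_err = b * da + a * db
    rel_err = abs_err / (a * b)
    return abs_err, rel_err

a, b = 2.0, 5.0
da, db = 0.02, 0.05          # a 1% error on each parameter

abs_err, rel_err = product_error(a, b, da, db)
print(abs_err)               # 0.2
print(rel_err)               # 0.02
print(da / a + db / b)       # 0.02: relative errors add for a product
```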