Derivative (generalizations)

The derivative is a fundamental construction of differential calculus, and it admits many generalizations within the fields of mathematical analysis, combinatorics, algebra, and geometry.

Derivatives in analysis

In real, complex, and functional analysis, derivatives are generalized to functions of several real or complex variables and functions between topological vector spaces. An important case is the variational derivative in the calculus of variations. Repeated application of differentiation leads to derivatives of higher order and differential operators.

Multivariable calculus

The derivative is often met for the first time as an operation on a single real-valued function of a single real variable. One of the simplest settings in which to generalize it is to vector-valued functions of several variables (most often the domain forms a vector space as well). This is the field of multivariable calculus.

In one-variable calculus, we say that a function f is differentiable at a point x if the limit

: \lim_{h \to 0} \frac{f(x+h) - f(x)}{h}

exists; its value is then the derivative f′(x). A function is differentiable on an interval if it is differentiable at every point of the interval.
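
As a quick numerical illustration of this limit, here is a minimal Python sketch; the function f(x) = x² and the point x = 3 are arbitrary example choices (the true derivative there is 6).

```python
# Difference quotients approaching the derivative f'(3) = 6 for f(x) = x**2
# (example function and point chosen only for illustration).
def f(x):
    return x ** 2

x = 3.0
for h in [1e-1, 1e-3, 1e-5]:
    print(h, (f(x + h) - f(x)) / h)   # 6.1, 6.001, 6.00001, ...
```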

We can generalize to functions mapping R^m to R^n as follows: f is differentiable at x if there exists a linear operator A(x) (depending on x) such that

: \lim_{\|h\| \to 0} \frac{\|f(x+h) - f(x) - A(x)h\|}{\|h\|} = 0.

Note that, in general, we concern ourselves mostly with functions that are differentiable in some open neighbourhood of x rather than at individual points, since working pointwise tends to lead to many pathological counterexamples.

The matrix of the linear operator A(x), with one row for each of the n range coordinates and one column for each of the m domain coordinates, is known as the Jacobian matrix J_x(f) of the mapping f at the point x. Each entry of this matrix is a partial derivative, specifying the rate of change of one range coordinate with respect to a change in a domain coordinate. The Jacobian matrix of the composition g ∘ f is the product of the corresponding Jacobian matrices: J_x(g ∘ f) = J_{f(x)}(g) J_x(f).
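
The following is a small sympy sketch of a Jacobian matrix and of the chain rule above; the two maps f and g are arbitrary illustrative choices, not anything fixed by the text.

```python
# A sketch of the Jacobian matrix and the chain rule J_x(g o f) = J_{f(x)}(g) J_x(f),
# using sympy; the maps f and g are example choices.
import sympy as sp

x, y = sp.symbols('x y')
u, v = sp.symbols('u v')

f = sp.Matrix([x * y, x + y])        # f : R^2 -> R^2 in the variables (x, y)
g = sp.Matrix([u ** 2, sp.sin(v)])   # g : R^2 -> R^2 in the variables (u, v)

Jf = f.jacobian([x, y])
Jg = g.jacobian([u, v])

gof = g.subs({u: f[0], v: f[1]})              # the composition g o f
lhs = gof.jacobian([x, y])                    # Jacobian of the composition
rhs = Jg.subs({u: f[0], v: f[1]}) * Jf        # product of the two Jacobians
print(sp.simplify(lhs - rhs))                 # the zero matrix
```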

For real-valued functions from R^n to R (scalar fields), the total derivative can be interpreted as a vector field called the gradient. An intuitive interpretation of the gradient is that it points "uphill": it points in the direction of fastest increase of the function. It can be used to calculate directional derivatives of scalar functions or normal directions.
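
As a small illustration, here is a hedged sympy sketch of a gradient and a directional derivative; the scalar field and direction vector below are arbitrary example choices.

```python
# Gradient of a scalar field and a directional derivative along a chosen vector
# (the field phi and the direction d are illustrative only).
import sympy as sp

x, y, z = sp.symbols('x y z')
phi = x ** 2 * y + sp.exp(z)

grad = sp.Matrix([sp.diff(phi, v) for v in (x, y, z)])   # (2*x*y, x**2, exp(z))
d = sp.Matrix([1, 2, 0])                                 # direction of interest
print(grad.dot(d))                                       # 2*x*y + 2*x**2
```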

Several linear combinations of partial derivatives are especially useful in the context of differential equations defined by a vector-valued function from R^n to R^n. The divergence measures how much of a "source" or "sink" there is near a point; it can be used to calculate flux via the divergence theorem. The curl measures how much "rotation" a vector field has near a point.
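
A brief sympy.vector sketch of divergence and curl follows; the sample field is an arbitrary choice.

```python
# Divergence and curl of a sample vector field F = (xy, yz, zx), via sympy.vector.
from sympy.vector import CoordSys3D, divergence, curl

N = CoordSys3D('N')
F = N.x * N.y * N.i + N.y * N.z * N.j + N.z * N.x * N.k

print(divergence(F))   # N.x + N.y + N.z
print(curl(F))         # -N.y*N.i - N.z*N.j - N.x*N.k
```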

For vector-valued functions from R to R^n (i.e., parametric curves), one can take the derivative of each component separately. The resulting derivative is another vector-valued function. This is useful, for example, when the vector-valued function describes the position of a particle over time: its derivative is then the velocity vector of the particle over time.
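
For instance, here is a minimal sympy sketch of componentwise differentiation of a curve; the helix below is an arbitrary example.

```python
# Componentwise derivative of a parametric curve t -> (cos t, sin t, t).
import sympy as sp

t = sp.symbols('t')
position = sp.Matrix([sp.cos(t), sp.sin(t), t])
velocity = position.diff(t)          # differentiate each component separately
print(velocity.T)                    # Matrix([[-sin(t), cos(t), 1]])
```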

The convective derivative takes into account changes due to time dependence and motion through space along a vector field.

Convex analysis

The subderivative and subgradient are generalizations of the derivative to convex functions.

Higher-order derivatives and differential operators

One can iterate the differentiation process, that is, apply derivatives more than once, obtaining derivatives of second and higher order. A more sophisticated idea is to combine several derivatives, possibly of different orders, in one algebraic expression, a differential operator. This is especially useful in considering ordinary linear differential equations with constant coefficients. For example, if "f"("x") is a twice differentiable function of one variable, the differential equation

: f'' + 2f' - 3f = 4x - 1

may be rewritten in the form

: L(f) = 4x - 1, where L = \frac{d^2}{dx^2} + 2\frac{d}{dx} - 3

is a "second order linear constant coefficient differential operator" acting on functions of x. The key idea here is that we consider a particular linear combination of zeroth, first and second order derivatives "all at once". This allows us to think of the set of solutions of this differential equation as a "generalized antiderivative" of its right-hand side 4x − 1, by analogy with ordinary integration, and formally write

: f(x) = L^{-1}(4x - 1).
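
To make the operator viewpoint concrete, here is a hedged sympy sketch that applies L and asks sympy's ODE solver for the corresponding "generalized antiderivative"; it only illustrates the idea and is not part of the original text.

```python
# The operator L = d^2/dx^2 + 2 d/dx - 3 applied to functions, and the solution
# set of L(f) = 4x - 1 obtained with sympy's ODE solver.
import sympy as sp

x = sp.symbols('x')
f = sp.Function('f')

def L(u):
    """Apply the second order constant coefficient operator to an expression u."""
    return sp.diff(u, x, 2) + 2 * sp.diff(u, x) - 3 * u

sol = sp.dsolve(sp.Eq(L(f(x)), 4 * x - 1), f(x))
print(sol)   # f(x) = C1*exp(-3*x) + C2*exp(x) - 4*x/3 - 5/9 (constant labels may differ)
```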

Higher derivatives can also be defined for functions of several variables, studied in multivariable calculus. In this case, instead of repeatedly applying the derivative, one repeatedly applies partial derivatives with respect to different variables. For example, the second order partial derivatives of a scalar function of n variables can be organized into an n by n matrix, the Hessian matrix. One of the subtle points is that the higher derivatives are not intrinsically defined, and depend on the choice of coordinates in a complicated fashion (in particular, the Hessian matrix of a function is not a tensor). Nevertheless, higher derivatives have important applications to the analysis of local extrema of a function at its critical points. For an advanced application of this analysis to the topology of manifolds, see Morse theory.
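
A small sympy sketch of a Hessian matrix; the function below is an arbitrary example.

```python
# The Hessian matrix of a scalar function of two variables, via sympy.hessian.
import sympy as sp

x, y = sp.symbols('x y')
f = x ** 3 + x * y ** 2

H = sp.hessian(f, (x, y))
print(H)   # Matrix([[6*x, 2*y], [2*y, 2*x]]); its definiteness at a critical point classifies local extrema
```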

As in the case of functions of one variable, we can combine first and higher order partial derivatives to arrive at a notion of a partial differential operator. Some of these operators are so important that they have their own names:

*The Laplace operator or Laplacian on R^3 is a second-order partial differential operator Δ given by the divergence of the gradient of a scalar function of three variables, or explicitly as

:: \Delta = \frac{\partial^2}{\partial x^2} + \frac{\partial^2}{\partial y^2} + \frac{\partial^2}{\partial z^2}. Analogous operators can be defined for functions of any number of variables; a short computational sketch of the Laplacian follows this list.

*The d'Alembertian or wave operator is similar to the Laplacian, but acts on functions of four variables. Its definition uses the indefinite metric of Minkowski space instead of the Euclidean dot product of R^3:

:: \Box = \frac{\partial^2}{\partial x^2} + \frac{\partial^2}{\partial y^2} + \frac{\partial^2}{\partial z^2} - \frac{1}{c^2}\frac{\partial^2}{\partial t^2}.
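
As promised above, here is a minimal sympy sketch of the Laplacian; the function φ is an arbitrary example choice.

```python
# The Laplacian as the sum of pure second partial derivatives, applied to a sample function.
import sympy as sp

x, y, z = sp.symbols('x y z')
phi = x ** 2 + y ** 2 + z ** 2

laplacian = sum(sp.diff(phi, v, 2) for v in (x, y, z))
print(laplacian)   # 6, since each second partial contributes 2
```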

Analysis on fractals

Laplacians and differential equations can be defined on fractals.

Fractional derivatives

In addition to n-th derivatives for any natural number n, there are various ways to define derivatives of fractional or negative order, which are studied in fractional calculus. The derivative of order −1 corresponds to the integral, whence the term differintegral.
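
For a flavour of how such operators act, here is a hedged sketch based on the Riemann–Liouville power rule D^a x^k = Γ(k+1)/Γ(k+1−a) · x^(k−a) for monomials; a proper treatment requires the full machinery of fractional calculus.

```python
# Fractional derivative of a monomial via the Riemann-Liouville power rule
# (illustrative only; general functions need the integral definition).
from math import gamma, pi, sqrt

def frac_deriv_monomial(k, a, x):
    """Order-a derivative of x**k, evaluated at x > 0."""
    return gamma(k + 1) / gamma(k + 1 - a) * x ** (k - a)

# The half-derivative of f(x) = x has the closed form 2*sqrt(x/pi):
print(frac_deriv_monomial(1, 0.5, 4.0))   # ~2.2568
print(2 * sqrt(4.0 / pi))                 # same value
```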

Complex analysis

In complex analysis, the central objects of study are holomorphic functions, which are complex-valued functions on the complex numbers satisfying a suitably extended definition of differentiability.

The Schwarzian derivative describes how a complex function is approximated by a fractional-linear map, in much the same way that a normal derivative describes how a function is approximated by a linear map.

Functional analysis

In functional analysis, the functional derivative defines the derivative of a functional on a space of functions with respect to a function. It is an extension of the directional derivative to an infinite-dimensional vector space.

The Fréchet derivative allows the extension of the directional derivative to a general Banach space. The Gâteaux derivative extends the concept to locally convex topological vector spaces. Fréchet differentiability is a strictly stronger condition than Gâteaux differentiability, even in finite dimensions. Between the two extremes is the quasi-derivative.

In measure theory, the Radon-Nikodym derivative generalizes the Jacobian, used for changing variables, to measures. It expresses one measure μ in terms of another measure ν (under certain conditions).

In the theory of abstract Wiener spaces, the "H"-derivative defines a derivative in certain directions corresponding to the Cameron-Martin Hilbert space.

The derivative also admits a generalization to the space of distributions on a space of functions: the distributional derivative is defined by integration by parts against a suitably well-behaved subspace of test functions.

On a function space, the linear operator which assigns to each function its derivative is an example of a differential operator. General differential operators include higher order derivatives. By means of the Fourier transform, pseudo-differential operators can be defined which allow for fractional calculus.

Difference operator, q-analogues and time scales

* The q-derivative of a function is defined by the formula

: D_q f(x) = \frac{f(qx) - f(x)}{(q-1)x}.

If f is a differentiable function of x, then in the limit as q → 1 we obtain the ordinary derivative; thus the q-derivative may be viewed as its q-deformation. A large body of results from ordinary differential calculus, such as the binomial formula and the Taylor expansion, have natural q-analogues that were discovered in the 19th century but remained relatively obscure for much of the 20th century, outside of the theory of special functions. The progress of combinatorics and the discovery of quantum groups changed the situation dramatically, and the popularity of q-analogues is on the rise.

* The difference operator of difference equations is another discrete analogue of the standard derivative:

: \Delta f(x) = f(x+1) - f(x).

A short numerical sketch of both the q-derivative and the difference operator follows this list.

* The q-derivative, the difference operator and the standard derivative can all be viewed as the same thing on different time scales.
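
The sketch below, in Python, computes both discrete analogues numerically; the function f(x) = x³ and the point x = 2 are arbitrary choices.

```python
# The q-derivative D_q f(x) = (f(qx) - f(x)) / ((q - 1) x) and the forward
# difference Delta f(x) = f(x + 1) - f(x), evaluated numerically.
def q_derivative(f, x, q):
    return (f(q * x) - f(x)) / ((q - 1) * x)

def difference(f, x):
    return f(x + 1) - f(x)

f = lambda x: x ** 3
x = 2.0

# As q -> 1 the q-derivative tends to the ordinary derivative f'(2) = 12:
for q in [2.0, 1.1, 1.001]:
    print(q, q_derivative(f, x, q))

# The forward difference at x = 2 is 3**3 - 2**3 = 19:
print(difference(f, x))
```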

Derivatives in algebra

In algebra, generalizations of the derivative can be obtained by imposing the Leibniz rule of differentiation in an algebraic structure, such as a ring or a Lie algebra.

Derivations

A derivation is a linear map on a ring or algebra which satisfies the Leibniz law (the product rule). Higher derivatives and algebraic differential operators can also be defined. They are studied in a purely algebraic setting in differential Galois theory and the theory of D-modules, but also turn up in many other areas, where they often agree with less algebraic definitions of derivatives.

For example, the formal derivative of a polynomial over a commutative ring R is defined by

: (a_d x^d + a_{d-1} x^{d-1} + \cdots + a_1 x + a_0)' = d a_d x^{d-1} + (d-1) a_{d-1} x^{d-2} + \cdots + a_1.

The mapping f \mapsto f' is then a derivation on the polynomial ring R[X]. This definition can be extended to rational functions as well.
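
As an illustration, here is a short Python sketch of the formal derivative for polynomials over Z/5Z, with coefficients stored as a plain list; the modulus and the polynomial are example choices.

```python
# Formal derivative of a polynomial over the ring Z/5Z, with coefficients listed
# from a_0 up to a_d (purely algebraic: no limits are involved).
P = 5

def formal_derivative(coeffs):
    """Map a_0 + a_1 x + ... + a_d x^d to a_1 + 2 a_2 x + ... + d a_d x^(d-1)."""
    return [(k * a) % P for k, a in enumerate(coeffs)][1:]

# (x^5 + 3x + 2)' = 5x^4 + 3, which reduces to the constant 3 over Z/5Z.
print(formal_derivative([2, 3, 0, 0, 0, 1]))   # [3, 0, 0, 0, 0]
```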

The notion of derivation applies to noncommutative as well as commutative rings, and even to non-associative algebraic structures, such as Lie algebras.

Also see Pincherle derivative.

Commutative algebra

In commutative algebra, Kähler differentials are universal derivations of a commutative ring or module. They can be used to define an analogue of the exterior derivative from differential geometry that applies to arbitrary algebraic varieties, instead of just smooth manifolds.

Number theory

In p-adic analysis, the usual definition of derivative is not quite strong enough, and one requires strict differentiability instead.

Also see arithmetic derivative.

Set theory and logic

The ideas of zero, addition, multiplication and exponentiation known from arithmetic have analogues in set theory, category theory and type theory. For example, here is a short list of set-theoretic ones:
* The empty set: \varnothing
* The Cartesian product of the sets A and B: A \times B
* The set of n-tuples of elements of the set A: A^n, for n \in \mathbb{N}
* The set of functions from A to B: B^A (sometimes written as A \to B or \mathrm{Hom}(A, B))

A sum A + B can also be defined for sets, and it is a fruitful concept. It is similar to the disjoint union of sets, but it uses labels to achieve a partition-like construct. (More precisely, it is the two-argument case of the more general construct

: \sum_\Lambda \vec{A} = \left\{ \langle \lambda, a \rangle \in \Lambda \times \bigcup \mathcal{A} \mid \lambda \in \Lambda \land a \in \vec{A}_\lambda \right\},

where \vec{A} : \Lambda \to \mathcal{A}.) There are analogous constructs for types, too (see also typeful functional programming languages). Now consider parametric types, e.g.

: F(X) = X^3.

Thus, we can write "polynomials" for types. Let us define the derivative here as we define it for polynomials over a ring:

: F'(X) = X^2 + X^2 + X^2.

The given expression can represent a (homogeneous) triple "with a hole".

There are also more interesting constructs than such polynomial ones, and this notion of "derivative" can be extended to them as well. The concept has practical applications in functional programming; for example, see Zipper (data structure).
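
Below is a minimal Python sketch of the "triple with a hole" idea (a zipper for a fixed-size tuple); the names focus and plug are illustrative, not a standard API.

```python
# A one-hole context for a triple: the derivative type F'(X) = 3*X^2 is modelled as
# "the two remaining elements plus the position of the hole".
from typing import Tuple, TypeVar

X = TypeVar('X')
TripleWithHole = Tuple[int, X, X]   # (hole position, remaining element, remaining element)

def focus(triple, i):
    """Split a triple into the focused element and its one-hole context."""
    rest = [v for j, v in enumerate(triple) if j != i]
    return triple[i], (i, rest[0], rest[1])

def plug(value, context):
    """Put a value back into the hole, recovering a triple."""
    i, a, b = context
    items = [a, b]
    items.insert(i, value)
    return tuple(items)

value, ctx = focus(('a', 'b', 'c'), 1)   # focus on the middle component
print(plug(value.upper(), ctx))          # ('a', 'B', 'c'): edit at the focus, then rebuild
```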

Derivatives in geometry

The main types of derivatives in geometry are the Lie derivative along a vector field, the exterior derivative, and covariant derivatives.

Differential topology

In differential topology, a vector field may be defined as a derivation on the ring of smooth functions on a manifold, and a tangent vector may be defined as a derivation at a point. This allows the abstraction of the notion of a directional derivative of a scalar function to general manifolds. For manifolds that are subsets of R^n, this tangent vector will agree with the directional derivative defined above.

The differential or pushforward of a map between manifolds is the induced map between the tangent spaces of those manifolds. It abstracts the Jacobian matrix.

On the exterior algebra of differential forms over a smooth manifold, the exterior derivative is the unique linear map which satisfies a graded version of the Leibniz law and squares to zero. It is a grade 1 derivation on the exterior algebra.

The Lie derivative is the rate of change of a vector or tensor field along the flow of another vector field. On vector fields, it is an example of a Lie bracket (vector fields form the Lie algebra of the diffeomorphism group of the manifold). It is a grade 0 derivation on the algebra.

Together with the interior product (a degree -1 derivation on the exterior algebra defined by contraction with a vector field), the exterior derivative and the Lie derivative form a Lie superalgebra.

Differential geometry

In differential geometry, the covariant derivative makes a choice for taking directional derivatives of vector fields along curves. This extends the directional derivative of scalar functions to sections of vector bundles or principal bundles. In Riemannian geometry, the existence of a metric chooses a unique preferred torsion-free covariant derivative, known as the Levi-Civita connection. See also gauge covariant derivative for a treatment oriented to physics.

The exterior covariant derivative extends the exterior derivative to vector valued forms.

Other generalizations

It may be possible to combine two or more of the above different notions of extension or abstraction of the original derivative. For example, in Finsler geometry, one studies spaces which look locally like Banach spaces. Thus one might want a derivative with some of the features of a functional derivative and the covariant derivative.

The study of stochastic processes requires a form of calculus known as the Malliavin calculus. One notion of derivative in this setting is the "H"-derivative of a function on an abstract Wiener space.

