# Score (statistics)


In statistics, the score or score function is the partial derivative, with respect to some parameter $\theta$, of the logarithm (commonly the natural logarithm) of the likelihood function. If the observation is $X$ and its likelihood is $L(\theta;X)$, then the score $V$ can be found through the chain rule:

$$V = \frac{\partial}{\partial\theta} \log L(\theta;X) = \frac{1}{L(\theta;X)} \frac{\partial L(\theta;X)}{\partial\theta}.$$

Note that $V$ is a function of $\theta$ and the observation $X$, so that, in general, it is not a statistic. Note also that $V$ indicates the "sensitivity" of $L(\theta;X)$ (its variation normalized by its value).
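To make the chain-rule identity concrete, here is a minimal sketch (illustrative, not from the article; the function names are hypothetical) that checks it numerically for one observation from a normal distribution with known unit variance, where the analytic score is $(x-\theta)/\sigma^2$:

```python
import math

def log_likelihood(theta, x, sigma=1.0):
    """Log-likelihood of one observation x under N(theta, sigma^2) (illustrative model)."""
    return -0.5 * math.log(2 * math.pi * sigma ** 2) - (x - theta) ** 2 / (2 * sigma ** 2)

def score(theta, x, sigma=1.0):
    """Analytic score for this model: d/dtheta log L = (x - theta) / sigma^2."""
    return (x - theta) / sigma ** 2

# A central finite difference of log L in theta should match the analytic score.
theta, x, h = 0.5, 1.3, 1e-6
numeric = (log_likelihood(theta + h, x) - log_likelihood(theta - h, x)) / (2 * h)
print(score(theta, x), numeric)  # both approximately 0.8
```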

## Mean

The expected value of $V$ with respect to the observation $x$, given $\theta$, written $\mathbb{E}(V \mid \theta)$, is zero. To see this, rewrite the definition of expectation, using the fact that the probability density function is just $L(\theta; x)$, which is conventionally denoted by $f(x; \theta)$ (in which the dependence on $x$ is more explicit). The corresponding cumulative distribution function is denoted as $F(x; \theta)$. With this change of notation and writing $f'_{\theta}(x; \theta)$ for the partial derivative with respect to $\theta$,

$$\mathbb{E}(V \mid \theta) = \int \frac{f'_{\theta}(x;\theta)}{f(x;\theta)} \, dF(x;\theta) = \int_X \frac{f'_{\theta}(x;\theta)}{f(x;\theta)} \, f(x;\theta) \, dx = \int_X \frac{\partial f(x;\theta)}{\partial\theta} \, dx$$

where the integral runs over the whole of the probability space of $X$ and a prime denotes partial differentiation with respect to $\theta$. If certain differentiability conditions are met, the integral may be rewritten as

$$\frac{\partial}{\partial\theta} \int_X f(x;\theta) \, dx = \frac{\partial}{\partial\theta} 1 = 0.$$

It is worth restating the above result in words: the expected value of the score is zero. Thus, if one were to repeatedly sample from some distribution and repeatedly calculate the score with the true $\theta$, then the mean value of the scores would tend to zero as the number of samples approached infinity.
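This can be illustrated by simulation (a sketch, not part of the original article): sample repeatedly from a Bernoulli distribution with an assumed true $\theta$ and watch the sample mean of the scores settle near zero.

```python
import random

random.seed(0)
theta = 0.3  # assumed "true" parameter for this sketch

def bernoulli_score(x, theta):
    """Score of a single Bernoulli(theta) observation x in {0, 1}."""
    return x / theta - (1 - x) / (1 - theta)

n = 100_000
scores = [bernoulli_score(1 if random.random() < theta else 0, theta)
          for _ in range(n)]
mean_score = sum(scores) / n
print(mean_score)  # close to 0, and closer as n grows
```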

## Variance

The variance of the score is known as the Fisher information and is written $\mathcal{I}(\theta)$. Because the expectation of the score is zero, this may be written as

$$\mathcal{I}(\theta) = \mathbb{E}\left\{ \left. \left[ \frac{\partial}{\partial\theta} \log L(\theta;X) \right]^2 \right| \theta \right\}.$$

Note that the Fisher information, as defined above, is not a function of any particular observation, as the random variable $X$ has been averaged out. This concept of information is useful when comparing two methods of observation of some random process.
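For a single Bernoulli observation, the averaging over $X$ can be done exactly by summing $f(x;\theta)\,V(x)^2$ over the two outcomes; the following sketch (illustrative, with a hypothetical function name) recovers the known closed form $1/(\theta(1-\theta))$:

```python
def fisher_information(theta):
    """Fisher information of one Bernoulli(theta) observation,
    computed directly as E[V^2] = sum over x of f(x; theta) * V(x)^2."""
    score = {1: 1 / theta, 0: -1 / (1 - theta)}  # score V(x) for each outcome
    prob = {1: theta, 0: 1 - theta}
    return sum(prob[x] * score[x] ** 2 for x in (0, 1))

theta = 0.3
print(fisher_information(theta), 1 / (theta * (1 - theta)))  # both equal 1/(theta(1-theta))
```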

## Example

Consider a Bernoulli process with $A$ successes and $B$ failures; the probability of success is $\theta$.

Then the likelihood $L$ is

$$L(\theta;A,B) = \frac{(A+B)!}{A!\,B!} \, \theta^A (1-\theta)^B,$$

so the score $V$ is given by taking the partial derivative of the log-likelihood function as follows:

$$V = \frac{\partial}{\partial\theta} \log\left[ L(\theta;A,B) \right] = \frac{1}{L} \frac{\partial L}{\partial\theta}.$$

This is a standard calculus problem: $A$ and $B$ are treated as constants. Since $\log L(\theta;A,B) = \log\frac{(A+B)!}{A!\,B!} + A\log\theta + B\log(1-\theta)$, differentiating term by term gives

$$V = \frac{A}{\theta} - \frac{B}{1-\theta}.$$

So if the score is zero, $\theta = A/(A+B)$. We can now verify that the expectation of the score is zero. Writing $n = A + B$ for the number of trials, the expectation of $A$ is $n\theta$ and the expectation of $B$ is $n(1-\theta)$, so the expectation of $V$ is

$$\mathbb{E}(V) = \frac{n\theta}{\theta} - \frac{n(1-\theta)}{1-\theta} = n - n = 0.$$

We can also check the variance of $V$. We know that $A + B = n$ (so $B = n - A$) and the variance of $A$ is $n\theta(1-\theta)$, so the variance of $V$ is

$$\operatorname{var}(V) = \operatorname{var}\left( \frac{A}{\theta} - \frac{n-A}{1-\theta} \right) = \operatorname{var}\left( A \left( \frac{1}{\theta} + \frac{1}{1-\theta} \right) \right) = \left( \frac{1}{\theta} + \frac{1}{1-\theta} \right)^2 \operatorname{var}(A) = \frac{n}{\theta(1-\theta)}.$$
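Both checks (zero mean and variance $n/(\theta(1-\theta))$) can also be verified by simulation; this sketch assumes $n = 50$ trials and $\theta = 0.4$, values chosen only for illustration:

```python
import random

random.seed(1)
n, theta = 50, 0.4  # trial count and success probability (assumed for this sketch)

def score_V(A, B, theta):
    """Score of the Bernoulli process with A successes and B failures."""
    return A / theta - B / (1 - theta)

reps = 20_000
vals = []
for _ in range(reps):
    A = sum(random.random() < theta for _ in range(n))  # draw A ~ Binomial(n, theta)
    vals.append(score_V(A, n - A, theta))

mean_V = sum(vals) / reps
var_V = sum(v * v for v in vals) / reps  # E(V) = 0, so E(V^2) estimates var(V)
print(mean_V, var_V, n / (theta * (1 - theta)))  # var_V is near n/(theta(1-theta))
```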

## See also

*Fisher information
*Information theory
*Score test
*Support curve


