Admissible decision rule
In classical (frequentist) decision theory, an admissible decision rule is a rule for making a decision that is "better" than any other rule that may compete with it, in a specific sense defined below. Generally speaking, in most decision problems the set of admissible rules is large, even infinite, but as will be seen there are good reasons to favor admissible rules.

Definition
Define sets $\Theta$, $\mathcal{X}$ and $\mathcal{A}$, where $\Theta$ are the states of nature, $\mathcal{X}$ the possible observations and $\mathcal{A}$ the actions that may be taken. A "decision rule" is a function $\delta : \mathcal{X} \rightarrow \mathcal{A}$, "i.e.," upon observing $x \in \mathcal{X}$, we choose to take action $\delta(x) \in \mathcal{A}$.
In addition, we define a "loss function" $L : \Theta \times \mathcal{A} \rightarrow \mathbb{R}$, where $\mathbb{R}$ is the set of real numbers, which measures the loss we incur by taking action $a \in \mathcal{A}$ when the true state of nature is $\theta \in \Theta$. Usually we will take this action after observing data $x \in \mathcal{X}$, so that the loss will be $L(\theta, \delta(x))$.
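To make these definitions concrete, here is a minimal sketch in Python of a hypothetical decision problem with two states of nature, a binary observation, and two actions; the particular loss values are illustrative choices, not part of the theory.

```python
# A toy decision problem (illustrative values only): two states of nature,
# a binary observation, and two actions.
THETAS = ["theta1", "theta2"]          # states of nature, Theta
OBS = [0, 1]                           # possible observations, X
ACTIONS = ["a1", "a2"]                 # actions that may be taken, A

# Loss function L(theta, a): the loss incurred by taking action a
# when the true state of nature is theta.
LOSS = {
    ("theta1", "a1"): 0.0, ("theta1", "a2"): 1.0,
    ("theta2", "a1"): 2.0, ("theta2", "a2"): 0.0,
}

# A decision rule delta: X -> A, represented as a map from observation to action.
delta = {0: "a2", 1: "a1"}

# Loss incurred after observing x = 1 when the true state is theta2.
x, true_theta = 1, "theta2"
print(LOSS[(true_theta, delta[x])])    # L(theta2, delta(1)) = L(theta2, a1) = 2.0
```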
It is possible to recast the theory in terms of a "utility function", the negative of the loss. However, admissibility is usually defined in terms of a loss function, and we shall follow this convention.
Let $x$ have cumulative distribution function $F(x \mid \theta)$. Define the "risk function" $R(\theta, \delta)$ as the expectation

$$R(\theta, \delta) = \operatorname{E}_{F(x \mid \theta)}[L(\theta, \delta(x))].$$
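Continuing the toy problem above, the following sketch computes the risk function by averaging the loss over the sampling distribution; the distribution $P(x \mid \theta)$ used here is again an arbitrary illustrative choice.

```python
# Sampling distribution P(x | theta) for the binary observation (illustrative values).
P_X_GIVEN_THETA = {
    "theta1": {0: 0.2, 1: 0.8},
    "theta2": {0: 0.7, 1: 0.3},
}
LOSS = {
    ("theta1", "a1"): 0.0, ("theta1", "a2"): 1.0,
    ("theta2", "a1"): 2.0, ("theta2", "a2"): 0.0,
}

def risk(theta, delta):
    """Frequentist risk R(theta, delta) = E_{x ~ F(.|theta)}[ L(theta, delta(x)) ]."""
    return sum(P_X_GIVEN_THETA[theta][x] * LOSS[(theta, delta[x])]
               for x in P_X_GIVEN_THETA[theta])

delta_C = {0: "a2", 1: "a1"}
print(risk("theta1", delta_C), risk("theta2", delta_C))    # -> 0.2 0.6
```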
A decision rule $\delta^*$ "dominates" a decision rule $\delta$ if and only if $R(\theta, \delta^*) \le R(\theta, \delta)$ for all $\theta$, "and" the inequality is strict for some $\theta$.
A decision rule is "admissible" if and only if no other rule dominates it; otherwise it is "inadmissible". An admissible rule should be preferred over an inadmissible rule since for any inadmissible rule there is an admissible rule that performs at least as well for all states of nature and better for some.
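The sketch below enumerates all four deterministic rules of the toy problem, checks dominance pairwise, and flags which rules are admissible; note that admissibility is checked here only relative to this finite set of deterministic rules.

```python
from itertools import product

P_X_GIVEN_THETA = {"theta1": {0: 0.2, 1: 0.8}, "theta2": {0: 0.7, 1: 0.3}}
LOSS = {("theta1", "a1"): 0.0, ("theta1", "a2"): 1.0,
        ("theta2", "a1"): 2.0, ("theta2", "a2"): 0.0}
THETAS, ACTIONS, OBS = ["theta1", "theta2"], ["a1", "a2"], [0, 1]

def risk(theta, delta):
    return sum(P_X_GIVEN_THETA[theta][x] * LOSS[(theta, delta[x])] for x in OBS)

def dominates(d1, d2):
    """d1 dominates d2: risk no worse for every theta and strictly better for some theta."""
    return (all(risk(t, d1) <= risk(t, d2) for t in THETAS)
            and any(risk(t, d1) < risk(t, d2) for t in THETAS))

# All 4 deterministic rules delta: {0, 1} -> {a1, a2}.
rules = [dict(zip(OBS, acts)) for acts in product(ACTIONS, repeat=len(OBS))]

# A rule is admissible (within this finite set) if no other rule dominates it.
for d in rules:
    admissible = not any(dominates(other, d) for other in rules if other != d)
    print(d, [risk(t, d) for t in THETAS], "admissible" if admissible else "inadmissible")
```

In this toy set, only the rule taking action a2 when $x = 0$ and a1 when $x = 1$ dominates another rule; the rule making the opposite choices is the single inadmissible one.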
Admissible rules and Bayes rules
Bayes rules
Let $\pi(\theta)$ be a probability distribution on the states of nature. From a Bayesian point of view, we would regard it as a "prior distribution". That is, it is our believed probability distribution on the states of nature, prior to observing data. For a frequentist, it is merely a function on $\Theta$ with no such special interpretation. The "Bayes risk" of the decision rule $\delta$ with respect to $\pi(\theta)$ is the expectation

$$r(\pi, \delta) = \operatorname{E}_{\pi(\theta)}[R(\theta, \delta)].$$
If the Bayes risk is finite, we can minimize $r(\pi, \delta)$ with respect to $\delta$ to obtain $\delta_\pi$, a "Bayes rule" with respect to $\pi(\theta)$. There may be more than one Bayes rule. If the Bayes risk is infinite, then no Bayes rule is defined.
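As an illustration, the following sketch computes the Bayes risk of each deterministic rule in the toy problem under a hypothetical prior $\pi(\theta_1) = 0.6$, $\pi(\theta_2) = 0.4$, and obtains a Bayes rule by direct minimization.

```python
from itertools import product

P_X_GIVEN_THETA = {"theta1": {0: 0.2, 1: 0.8}, "theta2": {0: 0.7, 1: 0.3}}
LOSS = {("theta1", "a1"): 0.0, ("theta1", "a2"): 1.0,
        ("theta2", "a1"): 2.0, ("theta2", "a2"): 0.0}
THETAS, ACTIONS, OBS = ["theta1", "theta2"], ["a1", "a2"], [0, 1]
PRIOR = {"theta1": 0.6, "theta2": 0.4}   # an illustrative prior pi on Theta

def risk(theta, delta):
    return sum(P_X_GIVEN_THETA[theta][x] * LOSS[(theta, delta[x])] for x in OBS)

def bayes_risk(prior, delta):
    """Bayes risk r(pi, delta) = E_pi[ R(theta, delta) ]."""
    return sum(prior[t] * risk(t, delta) for t in THETAS)

rules = [dict(zip(OBS, acts)) for acts in product(ACTIONS, repeat=len(OBS))]
bayes_rule = min(rules, key=lambda d: bayes_risk(PRIOR, d))
print(bayes_rule, bayes_risk(PRIOR, bayes_rule))   # -> {0: 'a2', 1: 'a1'} and roughly 0.36
```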
Admissibility of Bayes rules
In the Bayesian approach to decision theory, the observed $x$ is considered "fixed". Instead of averaging over the possible samples $x$, as in the frequentist approach, the Bayesian would average over the states of nature $\theta$. Thus, we would be interested in computing, for our observed $x$, the "expected loss"

$$\rho(\pi, \delta \mid x) = \operatorname{E}_{\pi(\theta \mid x)}[L(\theta, \delta(x))],$$

where $\pi(\theta \mid x)$ is the posterior distribution of $\theta$ given $x$, obtained from the prior $\pi(\theta)$ and the sampling distribution $F(x \mid \theta)$ via Bayes' theorem.
Since $x$ is considered fixed and known, we can choose $\delta(x)$ to minimize the expected loss for any $x$; by varying $x$ over its range, we can define a function $\delta(x)$, which is known as a "generalized Bayes rule". A generalized Bayes rule will be the same as some Bayes rule (relative to $\pi(\theta)$), provided that the Bayes risk is finite. Since more than one decision rule may minimize the expected loss, there may not be a unique generalized Bayes rule.
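The sketch below takes the Bayesian route directly: for each possible observation it forms the posterior, computes the posterior expected loss of each action, and picks a minimizing action. In this toy problem the prior is proper and the Bayes risk finite, so the resulting generalized Bayes rule coincides with the Bayes rule found above.

```python
P_X_GIVEN_THETA = {"theta1": {0: 0.2, 1: 0.8}, "theta2": {0: 0.7, 1: 0.3}}
LOSS = {("theta1", "a1"): 0.0, ("theta1", "a2"): 1.0,
        ("theta2", "a1"): 2.0, ("theta2", "a2"): 0.0}
THETAS, ACTIONS, OBS = ["theta1", "theta2"], ["a1", "a2"], [0, 1]
PRIOR = {"theta1": 0.6, "theta2": 0.4}   # an illustrative prior pi on Theta

def posterior(x):
    """Posterior pi(theta | x), from the prior and sampling distribution via Bayes' theorem."""
    joint = {t: PRIOR[t] * P_X_GIVEN_THETA[t][x] for t in THETAS}
    z = sum(joint.values())
    return {t: joint[t] / z for t in THETAS}

def expected_loss(x, a):
    """Posterior expected loss of taking action a after observing x."""
    post = posterior(x)
    return sum(post[t] * LOSS[(t, a)] for t in THETAS)

# For each x, choose an action minimizing the posterior expected loss.
generalized_bayes_rule = {x: min(ACTIONS, key=lambda a: expected_loss(x, a)) for x in OBS}
print(generalized_bayes_rule)   # -> {0: 'a2', 1: 'a1'}, the same rule as before
```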
According to the complete class theorems, under mild conditions every admissible rule is a (generalized) Bayes rule (with respect to some, possibly improper, prior). Thus, in frequentist decision theory it is sufficient to consider only (generalized) Bayes rules.

While Bayes rules with respect to proper priors are virtually always admissible, generalized Bayes rules corresponding to improper priors need not yield admissible procedures.
Stein's example is one such famous situation.
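The following Monte Carlo sketch illustrates Stein's example under squared-error loss: for $X \sim N(\theta, I_p)$ with $p \ge 3$, the usual estimator $\delta_0(X) = X$ (a generalized Bayes rule under the flat improper prior) is dominated by the James-Stein estimator and is therefore inadmissible. The dimension, true mean vector, and simulation size below are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
p, n_sims = 5, 200_000
theta = np.full(p, 1.0)                 # an arbitrary true mean vector

# Draw X ~ N(theta, I_p), n_sims independent replications.
X = rng.normal(loc=theta, scale=1.0, size=(n_sims, p))

delta_usual = X                                               # delta_0(X) = X
shrink = 1.0 - (p - 2) / np.sum(X**2, axis=1, keepdims=True)  # James-Stein shrinkage factor
delta_js = shrink * X

# Estimated risks under squared-error loss L(theta, a) = ||a - theta||^2.
risk_usual = np.mean(np.sum((delta_usual - theta) ** 2, axis=1))  # close to p = 5
risk_js = np.mean(np.sum((delta_js - theta) ** 2, axis=1))        # strictly below p
print(risk_usual, risk_js)
```

The usual estimator's risk equals $p$ for every $\theta$, while the James-Stein estimator's risk is below $p$ for every $\theta$ when $p \ge 3$, so the simulation should report a smaller estimate for the James-Stein estimator regardless of the true mean chosen (the gain is largest when the true mean is near the origin).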
References

* James O. Berger, "Statistical Decision Theory and Bayesian Analysis". Second Edition. Springer-Verlag, 1980, 1985. ISBN 0-387-96098-8.
* Morris De Groot, "Optimal Statistical Decisions". Wiley Classics Library, 2004. (Originally published 1970.) ISBN 0-471-68029-X.
* Christian P. Robert, "The Bayesian Choice". Springer-Verlag, 1994. ISBN 3-540-94296-3.