Henry E. Kyburg, Jr.
Henry E. Kyburg, Jr. (1928–2007) was Gideon Burbank Professor of Moral Philosophy and Professor of Computer Science at the University of Rochester, New York, and Pace Eminent Scholar at The Institute for Human and Machine Cognition, Pensacola, Florida. His first faculty posts were at Rockefeller Institute, University of Denver, Wesleyan College, and Wayne State University.

Kyburg worked in probability and logic, and is known for his Lottery Paradox (1961). Kyburg also edited "Studies in Subjective Probability" (1964) with Howard Smokler. Because of this collection's relation to Bayesian probability, Kyburg is often mistaken for a Bayesian. His own theory of probability is outlined in "Logical Foundations of Statistical Inference" (1974), a theory that first found form in his 1961 book "Probability and the Logic of Rational Belief" (in turn, a work closely related to his doctoral thesis). Kyburg describes his theory as Keynesian and Fisherian (see John Maynard Keynes and Ronald Fisher), a delivery on the promises of Rudolf Carnap and Hans Reichenbach for a logical probability based on reference classes, a reaction to Neyman–Pearson statistics (see Jerzy Neyman and Egon Pearson), and neutral with respect to Bayesian confirmational conditionalization. On the latter subject, Kyburg had extended discussion in the literature with lifelong friend and colleague Isaac Levi.

Kyburg's later major works include "Epistemology and Inference" (1983), a collection of essays; "Theory and Measurement" (1984), a response to Krantz–Luce–Suppes–Tversky's "Foundations of Measurement"; and "Science and Reason" (1990), which seeks to respond to Karl Popper's and Bruno de Finetti's concerns that empirical data could not confirm a universally quantified scientific axiom (e.g., "F = ma").

Kyburg was Fellow of the American Association for the Advancement of Science (1982), Fellow of the American Academy of Arts and Sciences (1995), Fellow of the American Association for Artificial Intelligence (2002), and recipient of the Butler Medal for Philosophy in Silver from Columbia University, where he received his PhD with Ernest Nagel as his advisor. Kyburg was also a graduate of Yale University and a Guggenheim Fellow.

Kyburg owned a farm in Lyons, New York, where he raised Angus cattle and promoted wind turbine systems for energy-independent farmers.

Philosophical Relatives
Several full professors of philosophy today were once undergraduates of Henry Kyburg, including Daniel Dennett, Robert Stalnaker, Rich Thomason, and Teddy Seidenfeld. Kyburg's own line of philosophical descent was: Gottfried Leibniz >> Christian Wolff >> [http://de.wikipedia.org/wiki/Martin_Knutzen Martin Knutzen] >> Immanuel Kant >> Karl Leonhard Reinhold >> Friedrich Adolf Trendelenburg >> George Sylvester Morris >> Josiah Royce / William James / Charles Peirce >> Morris Cohen >> Ernest Nagel >> Henry Kyburg.

Theory of Probability
Several ideas distinguish Kyburg's "Kyburgian" or "epistemological" interpretation of probability:
*Probability is measured by an interval (some mistake this as an affinity to Dempster–Shafer theory, but Kyburg firmly rejects their rule of combination; his work remained closer to confidence intervals, and was often interpreted by Bayesians as a commitment to a set of distributions, which Kyburg did not repudiate)
*All probability statements can be traced to direct inference of frequency in a reference class (there can be Bayes-rule calculations upon direct-inference conclusions, but there is nothing like a prior distribution in Kyburg's theory)
*The reference class is the most specific class with suitable frequency knowledge (this is the Reichenbach rule, which Kyburg made precise; his framework was later reinterpreted as a defeasible reasoning system by John L. Pollock, but Kyburg never intended the calculation of objective probabilities to be shortcut by bounded rationality due to computational imperfection)
*All probability inferences are based on knowledge of frequencies and properties, not ignorance of frequencies; however, randomness is essentially the lack of knowledge of bias (Kyburg especially rejects the maximum entropy methods of Harold Jeffreys, E. T. Jaynes, and other uses of the Principle of Indifference here; and Kyburg disagrees with Isaac Levi, who believes that chance must be positively asserted upon knowledge of relevant physical symmetries)
*There is no disagreement over the probability once there is agreement on the relevant knowledge; this is an objectivism relativized to an evidential state (i.e., relativized to a set of observed frequencies of properties in a class, and a set of asserted properties of events)

Example: Suppose a "corpus of knowledge" at a "level of acceptance." Contained in this corpus are the statements "e is a T1" and "e is a T2". The observed "frequency of P among T1" is .9. The observed "frequency of P among T2" is .4. What is the "probability that e is a P"? Here there are two "conflicting reference classes," so the probability is either "[0, 1]", or some interval combining the .4 and .9, which is sometimes just "[.4, .9]" (but often a different conclusion will be warranted). Adding the knowledge "All T1's are T2's" now makes T1 the "most specific relevant reference class" and a "dominator" of all "interfering reference classes." With this universal statement of class inclusion, the probability is .9, by "direct inference from T1". Kyburg's rules apply to conflict and subsumption in complicated partial orders.
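The example above can be sketched in code. This is a minimal illustration under assumed simplifications, not Kyburg's full calculus: the `probability` function, its dominance rule, and the interval-combination step (taking the covering interval of the surviving frequencies) are hypothetical stand-ins for his more elaborate treatment of conflict and subsumption.

```python
# A minimal sketch (not Kyburg's full calculus) of direct inference with
# conflicting reference classes. Class names, frequencies, and the
# interval-combination rule are illustrative assumptions.

# Known frequencies of property P in each reference class.
freq = {"T1": 0.9, "T2": 0.4}

# Statements in the corpus: e belongs to both classes.
memberships = {"T1", "T2"}

def probability(memberships, freq, inclusions):
    """Return an interval (lo, hi) for "e is a P" by direct inference.

    `inclusions` lists known class inclusions as (sub, super) pairs,
    e.g. ("T1", "T2") for "All T1's are T2's". A more specific class
    dominates the classes known to include it; among the classes e
    belongs to, only undominated classes contribute, and their
    frequencies are combined into a covering interval.
    """
    relevant = set(memberships)
    for sub, sup in inclusions:
        if sub in relevant and sup in relevant:
            relevant.discard(sup)   # the more specific class dominates
    values = [freq[c] for c in relevant]
    return (min(values), max(values))

# Conflicting reference classes, no inclusion knowledge:
print(probability(memberships, freq, []))              # -> (0.4, 0.9)

# Adding "All T1's are T2's" makes T1 the most specific class:
print(probability(memberships, freq, [("T1", "T2")]))  # -> (0.9, 0.9)
```

With the universal inclusion statement added, the interval collapses to the point value .9, matching the "direct inference from T1" in the example.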
Theory of Acceptance, Confirmation, Measurement, and Scientific Method
Kyburg's inferences are always relativized to a "level of acceptance" that defines a corpus of "morally certain" statements. This is like a level of confidence, except that Neyman–Pearson theory prohibits retrospective calculation and post-observational acceptance, while Kyburg's epistemological interpretation of probability licenses both. At a level of acceptance, any statement that is more probable than the level of acceptance can be adopted as if it were a certainty. This can create logical inconsistency, which led Kyburg to the lottery paradox, and which is solved by seeking unanimity among maximal consistent subsets of a corpus.

In the example above, the calculation that "e is a P" with probability .9 permits the "acceptance" of the statement "e is a P" categorically, at any level of acceptance lower than .9 (assuming also that the calculation was performed at an acceptance level above .9). The interesting tension is that very high levels of acceptance contain few evidentiary statements. They do not even include "raw observations of the senses" if those senses have often been fooled in the past. Similarly, if a measurement device reports within an interval of error at a rate of .95, then no measured statements are acceptable at a level above .95, unless the interval of error is widened. Meanwhile, at lower levels of acceptance, so many contradictory statements are acceptable that nothing useful can be derived in all maximal consistent subsets.
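The threshold rule and the inconsistency it creates can be illustrated with the lottery itself. The following toy sketch (the numbers and formulation are assumptions, not Kyburg's own notation) uses 100 tickets and an acceptance level of .95: each statement "ticket i loses" individually clears the threshold, yet jointly the accepted statements contradict the certainty that some ticket wins.

```python
# Toy illustration of acceptance at a level and the lottery paradox.
# The lottery size and acceptance level are illustrative choices.

n_tickets = 100
level = 0.95

# Probability that any particular ticket loses a fair 100-ticket lottery.
p_lose = 1 - 1 / n_tickets   # 0.99

accepted = []
if p_lose > level:
    # Each statement "ticket i loses" individually clears the threshold...
    accepted = [f"ticket {i} loses" for i in range(1, n_tickets + 1)]

# ...but their conjunction ("every ticket loses") has probability 0,
# contradicting the accepted background statement "some ticket wins".
p_all_lose = 0.0
inconsistent = len(accepted) == n_tickets and p_all_lose < level

print(len(accepted))   # 100 individually acceptable statements
print(inconsistent)    # the corpus is jointly inconsistent
```

Seeking unanimity among maximal consistent subsets blocks the paradoxical conclusion: "every ticket loses" does not hold in every maximal consistent subset of the accepted statements.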
Kyburg's treatment of universally quantified sentences is to add them to the "Ur-corpus" or "meaning postulates" of the language. There, a statement like "F = ma" or "preference is transitive" provides additional inferences at all acceptance levels. In some cases, the addition of an axiom produces predictions that are not refuted by experience. These are the adoptable theoretical postulates (and they must still be ordered by some kind of simplicity). In other cases, the theoretical postulate conflicts with the evidence and measurement-based observations, so the postulate must be rejected. In this way, Kyburg provides a probability-mediated model of predictive power, scientific theory-formation, the "Web of Belief", and linguistic variation. The theory of acceptance mediates the tension between linguistic categorical assertion and probability-based epistemology.

References
* [http://www.rochester.edu/news/show.php?id=3055 Official Obituary]