Background reading: PAPER
See general announcement here.
Background to the Discussion: Question: How did I get involved in disproving Birnbaum’s result in 2006?
Answer: Appeals to something called the “weak conditionality principle” (WCP) arose as a way to avoid a classic problem (arising from mixture tests) described by David Cox (1958), as discussed in our joint paper:
Cox, D. R. and Mayo, D. (2010). “Objectivity and Conditionality in Frequentist Inference,” in Error and Inference: Recent Exchanges on Experimental Reasoning, Reliability and the Objectivity and Rationality of Science (D. Mayo and A. Spanos, eds.), Cambridge: Cambridge University Press, 276–304.
However, Birnbaum had argued (1962) that the WCP (together with other uncontroversial principles) entailed the “strong likelihood principle” (SLP), from which it followed that frequentist sampling distributions were irrelevant for inference (once the data are available)! Moreover, Birnbaum’s result is presented as uncontroversial in many textbooks:
It is not uncommon to see statistics texts argue that in frequentist theory one is faced with the following dilemma: either to deny the appropriateness of conditioning on the precision of the tool chosen by the toss of a coin, or else to embrace the strong likelihood principle, which entails that frequentist sampling distributions are irrelevant to inference once the data are obtained. This is a false dilemma. . . . The “dilemma” argument is therefore an illusion. (Cox and Mayo 2010, 298).
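The contrast at the heart of Cox’s (1958) mixture-test problem can be made concrete numerically. In the sketch below, the particular sigma values and observed value are illustrative choices (not taken from Cox’s paper): a fair coin toss selects either a precise or an imprecise measuring instrument, and we compare the p-value computed conditionally on the instrument actually used with the unconditional p-value computed from the mixture sampling distribution.

```python
# Numerical sketch of the "two instruments" mixture test in the
# spirit of Cox (1958). Illustrative assumptions: a fair coin toss
# selects a precise instrument (sigma = 1) or an imprecise one
# (sigma = 10); we test H0: mu = 0 against mu > 0 with observation x.
from math import erf, sqrt

def normal_sf(x, mu=0.0, sigma=1.0):
    """Upper-tail probability P(X >= x) for X ~ N(mu, sigma^2)."""
    z = (x - mu) / (sigma * sqrt(2))
    return 0.5 * (1.0 - erf(z))

x_obs = 2.5  # suppose the toss selected the precise instrument

# Conditional p-value: use the sampling distribution of the
# instrument actually used.
p_conditional = normal_sf(x_obs, sigma=1.0)

# Unconditional p-value: average over the coin toss, i.e. use the
# mixture sampling distribution over both instruments.
p_unconditional = 0.5 * normal_sf(x_obs, sigma=1.0) \
                + 0.5 * normal_sf(x_obs, sigma=10.0)

print(f"conditional   p = {p_conditional:.4f}")    # ~0.0062
print(f"unconditional p = {p_unconditional:.4f}")  # ~0.2037
```

The same observation looks highly significant conditionally but unimpressive unconditionally, which is why denying the relevance of conditioning here seems untenable; the (false) dilemma is whether accepting such conditioning forces one all the way to the SLP.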
This led to Mayo 2010, and now to the discussion for my upcoming seminar. Other links on the strong likelihood principle (SLP): Cox & Mayo 2011 (appendix); Birnbaum 1970 Letter to Nature Editor; the current “U-Phil”; the “Breaking through the Breakthrough” posts of Dec 6 & Dec 7, 2011; and further searching of this blog*.
On the two subsequent seminars: see here**.
- 5 Dec (12 noon–2 p.m.): Sir David Cox
- 12 Dec (10 a.m.–12 noon): Dr. Stephen Senn
- For updates, details, and associated readings: please check the LSE Ph500 page on my blog, original announcement, or write to me.
Blurb for the series of 5 seminars, “Contemporary problems in PhilStat”: Debates over the philosophical foundations of statistical science have a long and fascinating history marked by deep and passionate controversies that intertwine with fundamental notions of the nature of statistical inference and the role of probabilistic concepts in inductive learning. Progress in resolving the decades-old controversies that still shake the foundations of statistics demands both philosophical and technical acumen, but gaining entry into the current state of play requires a road map that zeroes in on core themes and current standpoints. While the seminar will attempt to minimize technical details, it will be important to clarify key notions in order to contribute fully to the debates. Relevance for general philosophical problems will be emphasized. Because the contexts in which statistical methods are most needed are ones that compel us to be most aware of the strategies scientists use to cope with threats to reliability, considering the nature of statistical method in the collection, modeling, and analysis of data is an effective way to articulate and warrant general principles of evidence and inference.
Room 2.06, Lakatos Building; Centre for Philosophy of Natural and Social Science, London School of Economics, Houghton Street, London WC2A 2AE. Administrator: T. R. Chivers@lse.ac.uk
**I expect to be joined by Dr. C. Hennig on at least one of the days.
Filed under: Announcement, Likelihood Principle, Statistics