- Explaining the limits of Olsson's impossibility result. Southern Journal of Philosophy 50 (1): 136-150. 2012.
  In his groundbreaking book, Against Coherence (2005), Erik Olsson presents an ingenious impossibility theorem that appears to show that there is no informative relationship between probabilistic measures of coherence and higher likelihood of truth. Although Olsson's result provides an important insight into probabilistic models of epistemological coherence, the scope of his negative result is more limited than generally appreciated. The key issue is the role conditional independence conditions p…
- A Review of the Lottery Paradox. In William Harper & Gregory Wheeler (eds.), Probability and Inference: Essays in Honour of Henry E. Kyburg, Jr., College Publications. 2007.
  Henry Kyburg's lottery paradox (1961, p. 197) arises from considering a fair 1000-ticket lottery that has exactly one winning ticket. If this much is known about the execution of the lottery, it is therefore rational to accept that one ticket will win. Suppose that an event is very likely if the probability of its occurring is greater than 0.99. On these grounds it is presumed rational to accept the proposition that ticket 1 of the lottery will not win. Since the lottery is fair, it is rational t…
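The arithmetic behind the paradox can be sketched directly; this is a minimal illustration of the abstract's numbers, and the names (`accept`, `N`, `THRESHOLD`) are illustrative rather than from the paper:

```python
# Sketch of the lottery paradox: a fair 1000-ticket lottery with exactly
# one winner, and "very likely" meaning probability greater than 0.99.
# All names here are illustrative, not taken from the paper.

N = 1000          # tickets, exactly one of which wins
THRESHOLD = 0.99  # accept any proposition whose probability exceeds this

def accept(prob):
    """Acceptance rule presumed in the abstract: accept iff prob > 0.99."""
    return prob > THRESHOLD

# For each ticket i, P("ticket i will not win") = 999/1000 = 0.999,
# so every such proposition is individually accepted ...
losing_accepted = all(accept((N - 1) / N) for _ in range(N))

# ... yet their conjunction, "no ticket wins", has probability 0 and
# contradicts the accepted proposition that exactly one ticket wins.
no_ticket_wins_prob = 0.0
print(losing_accepted, accept(no_ticket_wins_prob))  # True False
```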
- Scoring Imprecise Credences: A Mildly Immodest Proposal. Philosophy and Phenomenological Research 92 (1): 55-78. 2016.
  Jim Joyce argues for two amendments to probabilism. The first is the doctrine that credences are rational, or not, in virtue of their accuracy or “closeness to the truth” (1998). The second is a shift from a numerically precise model of belief to an imprecise model represented by a set of probability functions (2010). We argue that both amendments cannot be satisfied simultaneously. To do so, we employ a (slightly-generalized) impossibility theorem of Seidenfeld, Schervish, and Kadane (2012), wh…
- Models, Models, and Models. Metaphilosophy 44 (3): 293-300. 2013.
  Michael Dummett famously maintained that analytic philosophy was simply philosophy that followed Frege in treating the philosophy of language as the basis for all other philosophy (1978, 441). But one important insight to emerge from computer science is how difficult it is to animate the linguistic artifacts that the analysis of thought produces. Yet modeling the effects of thought requires a new skill that goes beyond analysis: procedural literacy. Some of the most promising research in philos…
- Demystifying Dilation. Erkenntnis 79 (6): 1305-1342. 2014.
  Dilation occurs when an interval probability estimate of some event E is properly included in the interval probability estimate of E conditional on every event F of some partition, which means that one's initial estimate of E becomes less precise no matter how an experiment turns out. Critics maintain that dilation is a pathological feature of imprecise probability models, while others have thought the problem is with Bayesian updating. However, two points are often overlooked: (1) knowing that …
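The definition of dilation can be illustrated numerically with a standard toy example from the imprecise-probability literature (the example and names below are illustrative, not taken from the paper): toss a fair coin X and let Y = X XOR Z, where Z is a coin of unknown bias p in [0, 1], independent of X.

```python
# Numerical sketch of dilation. Before learning X, P(Y=1) is precisely
# 0.5 under every admissible bias p for Z; after learning X (either way),
# the estimate dilates to the whole unit interval.

def bounds(values):
    """Lower and upper envelope of a set of point probabilities."""
    return (min(values), max(values))

grid = [i / 100 for i in range(101)]  # candidate biases p for Z

# Unconditionally: P(Y=1) = P(X=1)P(Z=0) + P(X=0)P(Z=1) = 0.5 for all p.
p_y = [0.5 * (1 - p) + 0.5 * p for p in grid]

# Conditional on X = 1: P(Y=1 | X=1) = P(Z=0) = 1 - p.
p_y_given_x1 = [1 - p for p in grid]
# Conditional on X = 0: P(Y=1 | X=0) = P(Z=1) = p.
p_y_given_x0 = [p for p in grid]

print(bounds(p_y))           # (0.5, 0.5): precise before the experiment
print(bounds(p_y_given_x1))  # (0.0, 1.0): dilated, whichever way X lands
print(bounds(p_y_given_x0))  # (0.0, 1.0)
```

The precise interval {0.5} is properly included in [0, 1] under every cell of the partition {X=1, X=0}, which is exactly the pattern the abstract describes.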
- NO Revision and NO Contraction. Minds and Machines 21 (3): 411-430. 2011.
  One goal of normative multi-agent system theory is to formulate principles for normative system change that maintain the rule-like structure of norms and preserve links between norms and individual agent obligations. A central question raised by this problem is whether there is a framework for norm change that is at once specific enough to capture this rule-like behavior of norms, yet general enough to support a full battery of norm and obligation change operators. In this paper we propose an an…
- Epistemology and Artificial Intelligence. Journal of Applied Logic 2 (4): 469-493. 2004.
  In this essay we advance the view that analytical epistemology and artificial intelligence are complementary disciplines. Both fields study epistemic relations, but whereas artificial intelligence approaches this subject from the perspective of understanding formal and computational properties of frameworks purporting to model some epistemic relation or other, traditional epistemology approaches the subject from the perspective of understanding the properties of epistemic relations in terms of thei…
- An implementation of statistical default logic. In Jose Alferes & Joao Leite (eds.), Logics in Artificial Intelligence (JELIA 2004), Springer. 2004.
  Statistical Default Logic (SDL) is an expansion of classical (i.e., Reiter) default logic that allows us to model common inference patterns found in standard inferential statistics, e.g., hypothesis testing and the estimation of a population's mean, variance and proportions. This paper presents an embedding of an important subset of SDL theories, called literal statistical default theories, into stable model semantics. The embedding is designed to compute the signature set of literals that uniqu…
- Book review: Change, Choice and Inference: A Study of Belief Revision and Nonmonotonic Reasoning (review). Philosophy of Science 72 (3): 498-503. 2005.
- Formal Epistemology. In Andrew Cullison (ed.), A Companion to Epistemology, Continuum Press. 2010.
  Yet, in broader terms, formal epistemology is not merely a methodological tool for epistemologists, but a discipline in its own right. On this programmatic view, formal epistemology is an interdisciplinary research program that covers work by philosophers, mathematicians, computer scientists, statisticians, psychologists, operations researchers, and economists who aim to give mathematical and sometimes computational representations of, along with sound strategies for reasoning about, knowledge, …
- New Challenges to Philosophy of Science (edited book). Springer Verlag. 2013.
  This fourth volume of the Programme “The Philosophy of Science in a European Perspective” deals with new challenges in this field. In this regard, it seeks to broaden the scope of the philosophy of science in two directions. On the one hand, ...
- Focused correlation and confirmation. British Journal for the Philosophy of Science 60 (1): 79-100. 2009.
  This essay presents results about a deviation from independence measure called focused correlation. This measure explicates the formal relationship between probabilistic dependence of an evidence set and the incremental confirmation of a hypothesis, resolves a basic question underlying Peter Klein and Ted Warfield's ‘truth-conduciveness’ problem for Bayesian coherentism, and provides a qualified rebuttal to Erik Olsson's claim that there is no informative link between correlation and confirmati…
- Causation, Association, and Confirmation. In Stephan Hartmann, Marcel Weber, Wenceslao Gonzalez, Dennis Dieks & Thomas Uebel (eds.), Explanation, Prediction, and Confirmation, Springer. pp. 37-51. 2011.
  Many philosophers of science have argued that a set of evidence that is "coherent" confirms a hypothesis which explains such coherence. In this paper, we examine the relationships between probabilistic models of all three of these concepts: coherence, confirmation, and explanation. For coherence, we consider Shogenji's measure of association (deviation from independence). For confirmation, we consider several measures in the literature, and for explanation, we turn to Causal Bayes Nets and resor…
- Objective Bayesian Calibration and the Problem of Non-convex Evidence. British Journal for the Philosophy of Science 63 (4): 841-850. 2012.
  Jon Williamson's Objective Bayesian Epistemology relies upon a calibration norm to constrain credal probability by both quantitative and qualitative evidence. One role of the calibration norm is to ensure that evidence works to constrain a convex set of probability functions. This essay brings into focus a problem for Williamson's theory when qualitative evidence specifies non-convex constraints.
- Dilation, Disintegrations, and Delayed Decisions. In Thomas Augustin, Serena Doria, Enrique Miranda & Erik Quaeghebeur (eds.), Proceedings of the 9th International Symposium on Imprecise Probability: Theories and Applications (ISIPTA 2015), Aracne Editrice. 2015.
  Both dilation and non-conglomerability have been alleged to conflict with a fundamental principle of Bayesian methodology that we call Good's Principle: one should always delay making a terminal decision between alternative courses of action if given the opportunity to first learn, at zero cost, the outcome of an experiment relevant to the decision. In particular, both dilation and non-conglomerability have been alleged to permit or even mandate choosing to make a terminal decision in …
- On the Structure of Rational Acceptance: Comments on Hawthorne and Bovens. Synthese 144 (2): 287-304. 2005.
  The structural view of rational acceptance is a commitment to developing a logical calculus to express rationally accepted propositions sufficient to represent valid argument forms constructed from rationally accepted formulas. This essay argues for this project by observing that a satisfactory solution to the lottery paradox and the paradox of the preface calls for a theory that both (i) offers the facilities to represent accepting less than certain propositions within an interpreted artificial…
- Applied Logic without Psychologism. Studia Logica 88 (1): 137-156. 2008.
  Logic is a celebrated representation language because of its formal generality. But there are two senses in which a logic may be considered general: one that concerns a technical ability to discriminate between different types of individuals, and another that concerns constitutive norms for reasoning as such. This essay embraces the former, permutation-invariance conception of logic and rejects the latter, Fregean conception of logic. The question of how to apply logic under this pure invarianti…
- Modeling of Phenomena and Dynamic Logic of Phenomena. Journal of Applied Non-Classical Logic 22 (1): 1-82. 2011.
  Modeling a complex phenomenon such as the mind presents tremendous computational complexity challenges. Modeling field theory (MFT) addresses these challenges in a non-traditional way. The main idea behind MFT is to match levels of uncertainty of the model (also, a problem or some theory) with levels of uncertainty of the evaluation criterion used to identify that model. When a model becomes more certain, then the evaluation criterion is adjusted dynamically to match that change to the model. Th…
- Is there a logic of information? Journal of Theoretical and Applied Artificial Intelligence 27 (1): 95-98. 2015.
  Information-based epistemology maintains that ‘being informed’ is an independent cognitive state that cannot be reduced to knowledge or to belief, and the modal logic KTB has been proposed as a model. But what distinguishes the KTB analysis of ‘being informed’, the Brouwersche schema (B), is precisely its downfall, for no logic of information should include (B) and, more generally, no epistemic logic should include (B), either.
- Probabilistic Logics and Probabilistic Networks. Synthese Library. 2010.
  Additionally, the text shows how to develop computationally feasible methods to mesh with this framework.
- Humanists and Scientists. The Reasoner 1 (1). 2007.
  C.P. Snow observed that universities are largely made up of two broad types of people, literary intellectuals and scientists, yet a typical individual of each type is barely able, if able at all, to communicate with his counterpart. Snow's observation, popularized in his 1959 lecture The Two Cultures and the Scientific Revolution (reissued by Cambridge, 1993), goes some way to explaining the two distinct cultures one hears referred to as "the humanities" and "the sciences." Snow's lecture is a study …
- Why the Hardest Logic Puzzle Ever Cannot Be Solved in Less than Three Questions. Journal of Philosophical Logic 41 (2): 493-503. 2012.
  Rabern and Rabern (Analysis 68: 105–112) and Uzquiano (Analysis 70: 39–44) have each presented increasingly harder versions of ‘the hardest logic puzzle ever’ (Boolos, The Harvard Review of Philosophy 6: 62–65), and each has provided a two-question solution to his predecessor's puzzle. But Uzquiano's puzzle is different from the original and different from Rabern and Rabern's in at least one important respect: it cannot be solved in less than three questions. In this paper we solve Uzquiano…
- Focused Correlation, Confirmation, and the Jigsaw Puzzle of Variable Evidence. Philosophy of Science 78 (3): 376-392. 2011.
  Focused correlation compares the degree of association within an evidence set to the degree of association in that evidence set given that some hypothesis is true. A difference between the confirmation lent to a hypothesis by one evidence set and the confirmation lent to that hypothesis by another evidence set is robustly tracked by a difference in focused correlations of those evidence sets on that hypothesis, provided that all the individual pieces of evidence are equally, positively relevant …
- Rational acceptance and conjunctive/disjunctive absorption. Journal of Logic, Language and Information 15 (1-2): 49-63. 2006.
  A bounded formula is a pair consisting of a propositional formula φ in the first coordinate and a real number within the unit interval in the second coordinate, interpreted to express the lower-bound probability of φ. Converting conjunctive/disjunctive combinations of bounded formulas to a single bounded formula consisting of the conjunction/disjunction of the propositions occurring in the collection along with a newly calculated lower probability is called absorption. This paper introduces two …
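The notion of absorption described above can be sketched in a few lines. This is a hedged illustration: it uses the standard Fréchet lower bounds for conjunction and disjunction, which may differ in detail from the absorption rules the paper itself introduces, and all names below are illustrative.

```python
# Sketch of absorption on bounded formulas, modeled as
# (formula, lower_bound) pairs. Combination rules are the standard
# Frechet lower bounds, not necessarily the paper's own rules.

def conj_absorb(bf1, bf2):
    """Absorb a conjunction: P(phi & psi) >= max(0, a + b - 1)."""
    (phi, a), (psi, b) = bf1, bf2
    return ("(" + phi + " & " + psi + ")", max(0.0, a + b - 1))

def disj_absorb(bf1, bf2):
    """Absorb a disjunction: P(phi | psi) >= max(a, b)."""
    (phi, a), (psi, b) = bf1, bf2
    return ("(" + phi + " | " + psi + ")", max(a, b))

pq = conj_absorb(("p", 0.9), ("q", 0.8))
print(pq[0], round(pq[1], 10))  # (p & q) 0.7

p_or_q = disj_absorb(("p", 0.9), ("q", 0.8))
print(p_or_q)  # ('(p | q)', 0.9)
```

Both outputs are single bounded formulas, which is the shape absorption is meant to deliver: a compound proposition paired with a recomputed lower probability.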
- A Resource-bounded Default Logic. In J. Delgrande & T. Schaub (eds.), Proceedings of NMR 2004, AAAI. 2004.
  This paper presents statistical default logic, an expansion of classical (i.e., Reiter) default logic that allows us to model common inference patterns found in standard inferential statistics, including hypothesis testing and the estimation of a population's mean, variance and proportions. The logic replaces classical defaults with ordered pairs consisting of a Reiter default in the first coordinate and a real number within the unit interval in the second coordinate. This real number represents a…
Gregory Wheeler
Professor, Frankfurt School of Finance and Management
Areas of Specialization: Probabilistic Frameworks | Machine Learning | Formal Epistemology