Gregory Wheeler

Frankfurt School of Finance and Management
  •
    Coherence and Confirmation through Causation
    Mind 122 (485): 135-170. 2013.
    Coherentism maintains that coherent beliefs are more likely to be true than incoherent beliefs, and that coherent evidence provides more confirmation of a hypothesis when the evidence is made coherent by the explanation provided by that hypothesis. Although probabilistic models of credence ought to be well-suited to justifying such claims, negative results from Bayesian epistemology have suggested otherwise. In this essay we argue that the connection between coherence and confirmation should be…
  •
    Scoring Imprecise Credences: A Mildly Immodest Proposal
    Philosophy and Phenomenological Research 92 (1): 55-78. 2016.
    Jim Joyce argues for two amendments to probabilism. The first is the doctrine that credences are rational, or not, in virtue of their accuracy or “closeness to the truth” (1998). The second is a shift from a numerically precise model of belief to an imprecise model represented by a set of probability functions (2010). We argue that both amendments cannot be satisfied simultaneously. To do so, we employ a (slightly-generalized) impossibility theorem of Seidenfeld, Schervish, and Kadane (2012), wh…
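    A minimal sketch of where the tension starts, in Python (an illustration, not the paper's; the credal set and numbers are made up): a precise credence receives a single accuracy score, while scoring an imprecise credence member-by-member yields only a set of scores, and the impossibility result concerns whether any single real-valued score could do the ranking job instead.

      # Brier-scoring a precise credence vs. the members of a credal set
      # for one proposition A (illustrative numbers only).

      def brier(p, truth):
          # squared-error inaccuracy of credence p when A's truth value is truth (1 or 0)
          return (truth - p) ** 2

      credal_set = [0.3, 0.4, 0.6]  # hypothetical imprecise credence in A
      truth = 1                     # suppose A turns out true

      print(brier(0.5, truth))                      # one number for a precise credence
      print([brier(p, truth) for p in credal_set])  # only a set of numbers for an imprecise one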
  •
    Is there a logic of information?
    Journal of Experimental & Theoretical Artificial Intelligence 27 (1): 95-98. 2015.
    Information-based epistemology maintains that ‘being informed’ is an independent cognitive state that cannot be reduced to knowledge or to belief, and the modal logic KTB has been proposed as a model. But what distinguishes the KTB analysis of ‘being informed’, the Brouwersche schema (B), is precisely its downfall, for no logic of information should include (B) and, more generally, no epistemic logic should include (B), either.
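    For reference, the schemas of KTB, reading \Box\varphi as 'the agent is informed that \varphi':

      \mathrm{(K)}\quad \Box(\varphi \to \psi) \to (\Box\varphi \to \Box\psi) \qquad
      \mathrm{(T)}\quad \Box\varphi \to \varphi \qquad
      \mathrm{(B)}\quad \varphi \to \Box\Diamond\varphi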
  •
    Demystifying Dilation
    Erkenntnis 79 (6): 1305-1342. 2014.
    Dilation occurs when an interval probability estimate of some event E is properly included in the interval probability estimate of E conditional on every event F of some partition, which means that one’s initial estimate of E becomes less precise no matter how an experiment turns out. Critics maintain that dilation is a pathological feature of imprecise probability models, while others have thought the problem is with Bayesian updating. However, two points are often overlooked: (1) knowing that…
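    A standard toy case makes the definition concrete (an illustrative sketch, not the paper's own example; the labels X, Z, E, and a are ours): let X be a fair coin, let Z be a coin of unknown bias a tossed independently, and let E be the event that X and Z land the same way. Unconditionally P(E) = 1/2 for every value of a, yet conditional on either way X lands, the estimate of E dilates to the whole unit interval.

      # Dilation: a precise unconditional estimate becomes maximally imprecise
      # after conditioning, whichever way the experiment turns out.

      def p_E(a):                        # P(E) = P(X=H)*a + P(X=T)*(1-a)
          return 0.5 * a + 0.5 * (1 - a)  # = 1/2 for every a

      def p_E_given_heads(a):            # given X=H, E holds iff Z lands heads
          return a

      def p_E_given_tails(a):
          return 1 - a

      grid = [i / 10 for i in range(11)]  # credal set indexed by the unknown bias a
      print(sorted({round(p_E(a), 10) for a in grid}))   # [0.5]: precise before learning X
      print(min(map(p_E_given_heads, grid)), max(map(p_E_given_heads, grid)))  # 0.0 1.0
      print(min(map(p_E_given_tails, grid)), max(map(p_E_given_tails, grid)))  # 0.0 1.0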
  •
    Changing use of formal methods in philosophy: late 2000s vs. late 2010s
    with Samuel C. Fletcher, Joshua Knobe, and Brian Allan Woodcock
    Synthese 199 (5-6): 14555-14576. 2021.
    Traditionally, logic has been the dominant formal method within philosophy. Are logical methods still dominant today, or have the types of formal methods used in philosophy changed in recent times? To address this question, we coded a sample of philosophy papers from the late 2000s and from the late 2010s for the formal methods they used. The results indicate that the proportion of papers using logical methods remained more or less constant over that time period but the proportion of papers usin…
  •
    Recent debate about the error theory has taken a ‘formal turn’. On the one hand, there are those who argue that the error theory should be rejected because of its difficulties in providing a convincing formal account of the logic and semantics of moral claims. On the other hand, there are those who claim that such formal objections fail, maintaining that arguments against the error theory must be of a substantive rather than a formal kind. In this paper, we argue that formal objections to the er…
  •
    A sound and complete axiomatization of two tabloid blogs is presented, Leiter Logic (KB) and Deontic Leiter Logic (KDB), the latter of which can be extended to Shame Game Logic for multiple agents. The (B) schema describes the mechanism behind this class of tabloids, and illustrates the perils of interpreting a provability operator as an epistemic modal. To mark this difference, and to avoid sullying Brouwer's good name, the (B) schema for epistemic modals should be called the Blog Schema.
  •
    Epistemic decision theory (EDT) employs the mathematical tools of rational choice theory to justify epistemic norms, including probabilism, conditionalization, and the Principal Principle, among others. Practitioners of EDT endorse two theses: (1) epistemic value is distinct from subjective preference, and (2) belief and epistemic value can be numerically quantified. We argue the first thesis, which we call epistemic puritanism, undermines the second.
  •
    Machine Epistemology and Big Data
    In Lee C. McIntyre & Alexander Rosenberg (eds.), The Routledge Companion to Philosophy of Social Science, Routledge. 2016.
    In the age of big data and a machine epistemology that can anticipate, predict, and intervene on events in our lives, the problem once again is that a few individuals possess the knowledge of how to regulate these activities. But the question we face now is not how to share such knowledge more widely, but rather of how to enjoy the public benefits bestowed by this knowledge without freely sharing it. It is not merely personal privacy that is at stake but a range of unsung benefits that come from i…
  •
    Dilation and Asymmetric Relevance
    Proceedings of Machine Learning Research 103: 324-326. 2019.
    A characterization result of dilation in terms of positive and negative association admits an extremal counterexample, which we present together with a minor repair of the result. Dilation may be asymmetric whereas covariation itself is symmetric. Dilation is still characterized in terms of positive and negative covariation, however, once the event to be dilated has been specified.
  •
    Moving Beyond Sets of Probabilities
    Statistical Science 36 (2): 201-204. 2021.
    The theory of lower previsions is designed around the principles of coherence and sure-loss avoidance, and thus steers clear of all the updating anomalies highlighted in Gong and Meng's "Judicious Judgment Meets Unsettling Updating: Dilation, Sure Loss, and Simpson's Paradox" except dilation. In fact, the traditional problem with the theory of imprecise probability is that coherent inference is too complicated rather than unsettling. Progress has been made simplifying coherent inference by demotin…
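    As a quick gloss on the technical term (an illustrative sketch; the gamble and credal set below are made up): the lower prevision of a gamble is its minimum expectation over a set of probability mass functions, and the upper prevision is the maximum; coherence and sure-loss avoidance are constraints on these bounds.

      # Lower and upper previsions of a gamble over a small credal set
      # on a three-state space (illustrative numbers only).

      gamble = [10.0, 0.0, -5.0]   # payoff of the gamble in each state
      credal_set = [
          [0.2, 0.5, 0.3],
          [0.4, 0.4, 0.2],
          [0.1, 0.6, 0.3],
      ]

      def expectation(p, f):
          return sum(pi * fi for pi, fi in zip(p, f))

      lower = min(expectation(p, gamble) for p in credal_set)
      upper = max(expectation(p, gamble) for p in credal_set)
      print(lower, upper)          # lower prevision, upper prevision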
  •
    In this chapter we draw connections between two seemingly opposing approaches to probability and statistics: evidential probability on the one hand and objective Bayesian epistemology on the other.
  •
    Discounting Desirable Gambles
    Proceedings of Machine Learning Research 147: 331-341. 2021.
    The desirable gambles framework offers the most comprehensive foundations for the theory of lower previsions, which in turn affords the most general account of imprecise probabilities. Nevertheless, for all its generality, the theory of lower previsions rests on the notion of linear utility. This commitment to linearity is clearest in the coherence axioms for sets of desirable gambles. This paper considers two routes to relaxing this commitment. The first preserves the additive structure of…
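    The commitment to linearity the abstract mentions can be seen in one standard axiomatization (a paraphrase of the usual Walley-style conditions; formulations vary in detail): a coherent set \mathcal{D} of desirable gambles satisfies

      \text{(D1)}\ f \le 0 \Rightarrow f \notin \mathcal{D} \qquad
      \text{(D2)}\ f \ge 0,\ f \neq 0 \Rightarrow f \in \mathcal{D} \qquad
      \text{(D3)}\ f, g \in \mathcal{D} \Rightarrow f + g \in \mathcal{D} \qquad
      \text{(D4)}\ f \in \mathcal{D},\ \lambda > 0 \Rightarrow \lambda f \in \mathcal{D}

    Axioms (D3) and (D4) make \mathcal{D} a convex cone, and it is this cone structure that encodes linear utility.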
  •
    Error, Consistency and Triviality
    Noûs 56 (3): 602-618. 2022.
    In this paper, we present a new semantic challenge to the moral error theory. Its first component calls upon moral error theorists to deliver a deontic semantics that is consistent with the error-theoretic denial of moral truths by returning the truth-value false to all moral deontic sentences. We call this the ‘consistency challenge’ to the moral error theory. Its second component demands that error theorists explain in which way moral deontic assertions can be seen to differ in meaning despite…
  •
    Focused Correlation, Confirmation, and the Jigsaw Puzzle of Variable Evidence
    with Maximilian Schlosshauer
    Philosophy of Science 78 (3): 376-92. 2011.
    Focused correlation compares the degree of association within an evidence set to the degree of association in that evidence set given that some hypothesis is true. A difference between the confirmation lent to a hypothesis by one evidence set and the confirmation lent to that hypothesis by another evidence set is robustly tracked by a difference in focused correlations of those evidence sets on that hypothesis, provided that all the individual pieces of evidence are equally, positively relevant…
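    Concretely, focused correlation here is the ratio of the association of the evidence given the hypothesis to its plain association, with association measured as P(E1 ∧ E2)/(P(E1)P(E2)). A small Python sketch (an illustration; the joint distribution is made up):

      # Focused correlation of {E1, E2} on H for a toy joint distribution.
      # Worlds are triples (h, e1, e2) with 1 = true, 0 = false.

      P = {
          (1, 1, 1): 0.20, (1, 1, 0): 0.05, (1, 0, 1): 0.05, (1, 0, 0): 0.10,
          (0, 1, 1): 0.05, (0, 1, 0): 0.15, (0, 0, 1): 0.15, (0, 0, 0): 0.25,
      }

      def p(pred):
          return sum(q for w, q in P.items() if pred(w))

      cor = p(lambda w: w[1] and w[2]) / (p(lambda w: w[1]) * p(lambda w: w[2]))

      pH = p(lambda w: w[0])
      cor_H = (p(lambda w: w[0] and w[1] and w[2]) / pH) / (
          (p(lambda w: w[0] and w[1]) / pH) * (p(lambda w: w[0] and w[2]) / pH))

      print(cor_H / cor)  # > 1 here: H focuses (strengthens) the association of the evidence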
  •
    A Review of the Lottery Paradox
    In William Harper & Gregory Wheeler (eds.), Probability and Inference: Essays in Honour of Henry E. Kyburg, Jr., College Publications. 2007.
    Henry Kyburg’s lottery paradox (1961, p. 197) arises from considering a fair 1000-ticket lottery that has exactly one winning ticket. If this much is known about the execution of the lottery, it is rational to accept that one ticket will win. Suppose that an event is very likely if the probability of its occurring is greater than 0.99. On these grounds it is presumed rational to accept the proposition that ticket 1 of the lottery will not win. Since the lottery is fair, it is rational t…
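    In numbers, the familiar argument runs (an illustrative rendering, not a quotation from the review): for each ticket i,

      P(\neg W_i) = \tfrac{999}{1000} > 0.99, \qquad \text{yet} \qquad
      \bigwedge_{i=1}^{1000} \neg W_i \ \text{ is inconsistent with } \ \bigvee_{i=1}^{1000} W_i,

    so threshold acceptance licenses each claim individually while the full set of accepted claims is jointly inconsistent.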
  •
    Epistemic naturalism holds that the results or methodologies from the cognitive sciences are relevant to epistemology, and some have maintained that scientific methods are more compatible with externalist theories of justification than with internalist theories. But practically all discussions about naturalized epistemology are framed exclusively in terms of cognitive psychology, which is only one of the cognitive sciences. The question addressed in this essay is whether a commitment to naturali…
  •
    Models, Models, and Models
    Metaphilosophy 44 (3): 293-300. 2013.
    Michael Dummett famously maintained that analytic philosophy was simply philosophy that followed Frege in treating the philosophy of language as the basis for all other philosophy (1978, 441). But one important insight to emerge from computer science is how difficult it is to animate the linguistic artifacts that the analysis of thought produces. Yet, modeling the effects of thought requires a new skill that goes beyond analysis: procedural literacy. Some of the most promising research in philos…
  •
    Two compelling principles, the Reasonable Range Principle and the Preservation of Irrelevant Evidence Principle, are necessary conditions that any response to peer disagreements ought to abide by. The Reasonable Range Principle maintains that a resolution to a peer disagreement should not fall outside the range of views expressed by the peers in their dispute, whereas the Preservation of Irrelevant Evidence Principle maintains that a resolution strategy should be able to preserve unanimous judgm…
  •
    Why the Hardest Logic Puzzle Ever Cannot Be Solved in Less than Three Questions
    with Pedro Barahona
    Journal of Philosophical Logic 41 (2): 493-503. 2012.
    Rabern and Rabern (Analysis 68: 105–112) and Uzquiano (Analysis 70: 39–44) have each presented increasingly harder versions of ‘the hardest logic puzzle ever’ (Boolos, The Harvard Review of Philosophy 6: 62–65), and each has provided a two-question solution to his predecessor’s puzzle. But Uzquiano’s puzzle is different from the original and different from Rabern and Rabern’s in at least one important respect: it cannot be solved in less than three questions. In this paper we solve Uzquiano…
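    The baseline count behind such bounds (a sketch of the standard information-theoretic argument, not the paper's proof): the gods can play the roles True, False, and Random in 3! = 6 ways, and n two-valued answers separate at most 2^n cases, while exploiting a third answer outcome (e.g. an unanswerable question) separates up to 3^n, which is how two-question solutions to the earlier versions were possible:

      3! = 6 > 2^2 = 4, \qquad 3^2 = 9 \ge 6.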
  •
    On the imprecision of full conditional probabilities
    with Fabio G. Cozman
    Synthese 199 (1-2): 3761-3782. 2021.
    The purpose of this paper is to show that if one adopts conditional probabilities as the primitive concept of probability, one must deal with the fact that even in very ordinary circumstances at least some probability values may be imprecise, and that some probability questions may fail to have numerically precise answers.
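    For orientation, a full conditional probability in the de Finetti-Dubins tradition (a paraphrase of the standard definition) takes P(A | B) as primitive and defined even when P(B) = 0, subject to

      P(\cdot \mid B) \text{ is a probability}, \qquad P(B \mid B) = 1, \qquad
      P(A \cap B \mid C) = P(A \mid B \cap C)\, P(B \mid C) \quad (B \cap C \neq \varnothing).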
  •
    Objective Bayesian Calibration and the Problem of Non-convex Evidence
    British Journal for the Philosophy of Science 63 (4): 841-850. 2012.
    Jon Williamson's Objective Bayesian Epistemology relies upon a calibration norm to constrain credal probability by both quantitative and qualitative evidence. One role of the calibration norm is to ensure that evidence works to constrain a convex set of probability functions. This essay brings into focus a problem for Williamson's theory when qualitative evidence specifies non-convex constraints.
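    A schematic case of the problem (our own example, not Williamson's or the paper's): suppose qualitative evidence establishes that a coin is biased one way or the other, so that

      P(H) \in \{0.2, 0.8\}, \qquad \mathrm{conv}\{0.2, 0.8\} = [0.2, 0.8] \ni 0.5,

    and convexifying the evidence set admits the fair-coin value 0.5, the very value the evidence rules out.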
  •
    Focused correlation and confirmation
    British Journal for the Philosophy of Science 60 (1): 79-100. 2009.
    This essay presents results about a deviation from independence measure called focused correlation. This measure explicates the formal relationship between probabilistic dependence of an evidence set and the incremental confirmation of a hypothesis, resolves a basic question underlying Peter Klein and Ted Warfield's ‘truth-conduciveness’ problem for Bayesian coherentism, and provides a qualified rebuttal to Erik Olsson's claim that there is no informative link between correlation and confirmati…
  •
    Conditionals and consequences
    with Henry E. Kyburg and Choh Man Teng
    Journal of Applied Logic 5 (4): 638-650. 2007.
    We examine the notion of conditionals and the role of conditionals in inductive logics and arguments. We identify three mistakes commonly made in the study of, or motivation for, non-classical logics. A nonmonotonic consequence relation based on evidential probability is formulated. With respect to this acceptance relation, some rules of inference of System P are unsound, and we propose refinements that hold in our framework.
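    To illustrate the kind of unsoundness at issue (a standard example; whether it is among the paper's specific cases we leave open), consider System P's And rule:

      \text{(And)}\quad \frac{\alpha \mid\!\sim \beta \qquad \alpha \mid\!\sim \gamma}{\alpha \mid\!\sim \beta \wedge \gamma}

    Under a threshold-style acceptance relation, each conclusion can clear the threshold while their conjunction falls below it, lottery-style, so the rule fails.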
  •
    Epistemology and Artificial Intelligence
    with Luis Moniz Pereira
    Journal of Applied Logic 2 (4): 469-493. 2004.
    In this essay we advance the view that analytical epistemology and artificial intelligence are complementary disciplines. Both fields study epistemic relations, but whereas artificial intelligence approaches this subject from the perspective of understanding formal and computational properties of frameworks purporting to model some epistemic relation or other, traditional epistemology approaches the subject from the perspective of understanding the properties of epistemic relations in terms of thei…
  •
    Both dilation and non-conglomerability have been alleged to conflict with a fundamental principle of Bayesian methodology that we call Good's Principle: one should always delay making a terminal decision between alternative courses of action if given the opportunity to first learn, at zero cost, the outcome of an experiment relevant to the decision. In particular, both dilation and non-conglomerability have been alleged to permit or even mandate choosing to make a terminal decision in…
  •
    Additionally, the text shows how to develop computationally feasible methods to mesh with this framework.
  •
    A Gentle Approach to Imprecise Probabilities
    In Thomas Augustin, Fabio Gagliardi Cozman & Gregory Wheeler (eds.), Reflections on the Foundations of Probability and Statistics: Essays in Honor of Teddy Seidenfeld, Springer. pp. 37-67. 2022.
    The field of imprecise probability has matured, in no small part because of Teddy Seidenfeld’s decades of original scholarship and essential contributions to building and sustaining the ISIPTA community. Although the basic idea behind imprecise probability is (at least) 150 years old, a mature mathematical theory has only taken full form in the last 30 years. Interest in imprecise probability during this period has also grown, but many of the ideas that the mature theory serves can be diffic…
  •
    Error statistics and Duhem's problem
    Philosophy of Science 67 (3): 410-420. 2000.
    No one has a well-developed solution to Duhem's problem, the problem of how experimental evidence warrants revision of our theories. Deborah Mayo proposes a solution to Duhem's problem en route to her more ambitious program of providing a philosophical account of inductive inference and experimental knowledge. This paper is a response to Mayo's Error Statistics (ES) program, paying particular attention to her response to Duhem's problem. It turns out that Mayo's purported solution to Duhem's pro…