  •  On the dilemma for partial subjunctive supposition
    Analysis 84 (3): 576-592. 2024.
    In ‘The logic of partial supposition’, Eva and Hartmann present a dilemma for a normative account of partial subjunctive supposition: the natural subjunctive analogue of Jeffrey conditionalization is Jeffrey imaging, but this rule violates a natural monotonicity constraint. This paper offers a partial defence of Jeffrey imaging against Eva and Hartmann’s objection. I show that, although Jeffrey imaging is non-monotonic in Eva and Hartmann’s sense, it is what I call status quo monotonic. A status…
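    A minimal sketch of the rule at issue, in my own notation (the finite set of worlds W and the selection function f taking each world w and proposition E to the "closest" E-world f(w, E) are assumptions of the sketch, not claims about the paper's framework): given a partition {E_1, …, E_n} and new weights q_1, …, q_n summing to 1, Jeffrey imaging is standardly taken to be
    \[
    P^{\mathrm{JI}}(w') \;=\; \sum_{i=1}^{n} q_i \sum_{w \in W} P(w)\,\mathbf{1}\!\left[f(w, E_i) = w'\right].
    \]
    With a single cell and weight 1 this reduces to ordinary (Lewisian) imaging on that proposition, just as Jeffrey conditionalization reduces to strict conditionalization.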
  •  Approximate rationality and ideal rationality
    Asian Journal of Philosophy 3 (2): 1-11. 2024.
    According to approximate Bayesianism, Bayesian norms are ideal norms worthy of approximation for non-ideal agents. This paper discusses one potential challenge for approximate Bayesianism: in non-transparent learning situations—situations where the agent does not learn what they have or have not learnt—it is unclear that the Bayesian norms are worth satisfying, let alone approximating. I discuss two replies to this challenge and find neither satisfactory. I suggest that what transpires is a gene…
  •  Jeffrey Meets Kolmogorov: A General Theory of Conditioning
    Journal of Philosophical Logic 49 (5): 941-979. 2020.
    Jeffrey conditionalization is a rule for updating degrees of belief in light of uncertain evidence. It is usually assumed that the partitions involved in Jeffrey conditionalization are finite and only contain positive-credence elements. But there are interesting examples, involving continuous quantities, in which this is not the case. (Q1) Can Jeffrey conditionalization be generalized to accommodate continuous cases? Meanwhile, several authors, such as Kenny Easwaran and Michael Rescorla, have bee…
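    For reference, the usual finite, positive-credence form of Jeffrey conditionalization mentioned in the abstract (notation mine): if experience shifts the agent's credences over a partition {E_1, …, E_n} to new values q_1, …, q_n with q_1 + … + q_n = 1, then the updated credence in any proposition A is
    \[
    P_{\mathrm{new}}(A) \;=\; \sum_{i=1}^{n} P(A \mid E_i)\, q_i .
    \]
    This formulation presupposes that the partition is finite and that each P(E_i) > 0, so that the conditional probabilities are defined; these are precisely the assumptions the continuous cases put under pressure.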
  •  Kolmogorov Conditionalizers Can Be Dutch Booked
    Review of Symbolic Logic 1-36. forthcoming.
    A vexing question in Bayesian epistemology is how an agent should update on evidence which she assigned zero prior credence. Some theorists have suggested that, in such cases, the agent should update by Kolmogorov conditionalization, a norm based on Kolmogorov’s theory of regular conditional distributions. However, it turns out that in some situations, a Kolmogorov conditionalizer will plan to always assign a posterior credence of zero to the evidence she learns. Intuitively, such a plan is irra…
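    For orientation, a compressed statement of the underlying notion (standard measure theory, not a quotation from the paper): given a probability space (Ω, F, P) and a sub-σ-algebra G ⊆ F, a regular conditional distribution is a map κ : Ω × F → [0, 1] such that
    \[
    \kappa(\omega, \cdot)\ \text{is a probability measure for every}\ \omega \in \Omega, \qquad \omega \mapsto \kappa(\omega, A)\ \text{is a version of}\ E[\mathbf{1}_A \mid \mathcal{G}]\ \text{for every}\ A \in \mathcal{F}.
    \]
    Roughly, Kolmogorov conditionalization directs an agent who learns, at world ω, which cell of a (possibly zero-credence) evidence partition she occupies to adopt κ(ω, ·) as her posterior; the precise formulation of the norm, and of the Dutch book against it, is the paper's.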
  •  The Borel-Kolmogorov paradox is often presented as an obscure problem that certain mathematical accounts of conditional probability must face. In this article, we point out that the paradox arises in the physical sciences, for physical probability or chance. By carefully formulating the paradox in this setting, we show that it is a puzzle for everyone, regardless of one’s preferred probability formalism. We propose a treatment that is inspired by the approach that scientists took when confronted…
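    The standard textbook illustration of the paradox (my example, not a quotation from the paper): let a point be uniformly distributed on the unit sphere, with latitude φ and longitude λ, and condition on the point's lying on a great circle, an event of probability zero. Treating the circle as the equator (constant latitude) versus as a meridian circle (constant longitude) gives different answers:
    \[
    p(\lambda \mid \varphi = 0) \;\propto\; 1, \qquad \text{whereas} \qquad p(\varphi \mid \lambda = \lambda_0) \;\propto\; \cos\varphi ,
    \]
    so geometrically congruent zero-probability sets receive incompatible conditional distributions depending on how the conditioning is parametrized.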