Gustavo Cevolani

IMT School for Advanced Studies Lucca
  •
    We examine the relationship between scientific knowledge and the legal system with a focus on the exclusion of expert testimony from trial as regulated by the Daubert standard in the US. We introduce a simple framework to understand and assess the role of judges as “gatekeepers”, monitoring the admission of science in the courtroom. We show how judges face a crucial choice, namely, whether to limit Daubert assessment to the abstract reliability of the methods used by the expert witness or also to check…
  •
    Theory change is a central concern in contemporary epistemology and philosophy of science. In this paper, we investigate the relationships between two ongoing research programs providing formal treatments of theory change: the (post-Popperian) approach to verisimilitude and the AGM theory of belief change. We show that appropriately construed accounts emerging from those two lines of epistemological research do yield convergences relative to a specified kind of theories, here labeled “conjunctive”…
  •
    Review of Olsson, Erik J. and Enqvist, Sebastian, Belief Revision meets Philosophy of Science
  •
    In this paper, we address the problem of truth approximation through theory change, asking whether revising our theories by newly acquired data leads us closer to the truth about a given domain. More particularly, we focus on “nomic conjunctive theories”, i.e., theories expressed as conjunctions of logically independent statements concerning the physical or, more generally, nomic possibilities and impossibilities of the domain under inquiry. We define both a comparative and a quantitative notion…
  •
    Starting with Popper, philosophers and logicians have proposed different accounts of verisimilitude or truthlikeness. One way of classifying such accounts is to distinguish between “conjunctive” and “disjunctive” ones. In this paper, we focus on our own “basic feature” approach to verisimilitude, which naturally belongs to the conjunctive family. We start by surveying the landscape of conjunctive accounts; then, we introduce two new measures of verisimilitude and discuss their properties; finally…
  •
    This paper contributes to the debate on the question of whether a systematic connection obtains between one’s commitment to realism or antirealism and one’s attitude towards the possibility of radical theoretical novelty, namely, theory change affecting our best, most successful theories (see, e.g., Stanford in Synthese 196:3915–3932, 2019; Dellsén in Stud Hist Philos Sci 76:30–38, 2019). We argue that it is not allegiance to realism or antirealism as such that primarily dictates one’s response…
  •
    After Karl Popper’s original work, several approaches were developed to provide a sound explication of the notion of verisimilitude. With few exceptions, these contributions have assumed that the truth to be approximated is deterministic. This collection of ten papers addresses the more general problem of approaching probabilistic truths. They include attempts to find appropriate measures for the closeness to probabilistic truth and to evaluate claims about such distances on the basis of empirical…
  •
    Defending De-idealization in Economic Modeling: A Case Study
    Philosophy of the Social Sciences 52 (1-2): 25-52. 2022.
    This paper defends the viability of de-idealization strategies in economic modeling against recent criticism. De-idealization occurs when an idealized assumption of a theoretical model is replaced with a more realistic one. Recently, some scholars have raised objections against the possibility or fruitfulness of de-idealizing economic models, suggesting that economists do not employ this kind of strategy. We present a detailed case study from the theory of industrial organization, discussing three…
  •
    Simple Models in Complex Worlds: Occam’s Razor and Statistical Learning Theory
    with Falco J. Bargagli Stoffi and Giorgio Gnecco
    Minds and Machines 32 (1): 13-42. 2022.
    The idea that “simplicity is a sign of truth”, and the related “Occam’s razor” principle, stating that, all other things being equal, simpler models should be preferred to more complex ones, have long been discussed in philosophy and science. We explore these ideas in the context of supervised machine learning, namely the branch of artificial intelligence that studies algorithms which balance simplicity and accuracy in order to effectively learn about the features of the underlying domain. Focusing…
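    As a schematic rendering of the simplicity–accuracy balance described above (a generic formulation from statistical learning, not necessarily the measures used in the paper itself), regularized empirical risk minimization selects the hypothesis
    $$\hat{h} \;=\; \underset{h \in \mathcal{H}}{\arg\min} \left[ \frac{1}{n} \sum_{i=1}^{n} L\big(h(x_i), y_i\big) \;+\; \lambda\, C(h) \right]$$
    where $L$ is the loss on the training data, $C(h)$ is a complexity measure on hypotheses, and $\lambda > 0$ fixes how much simplicity is traded against accuracy.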
  •
    Reverse inference is a crucial inferential strategy used in cognitive neuroscience to derive conclusions about the engagement of cognitive processes from patterns of brain activation. While widely employed in experimental studies, it is now viewed with increasing scepticism within the neuroscience community. One problem with reverse inference is that it is logically invalid, being an instance of abduction in Peirce’s sense. In this paper, we offer the first systematic analysis of reverse inference…
  •
    Multiple discoveries, inevitability, and scientific realism
    Studies in History and Philosophy of Science Part A 90 (December 2021): 30-38. 2021.
    When two or more (groups of) researchers independently investigating the same domain arrive at the same result, a multiple discovery occurs. The pervasiveness of multiple discoveries in science suggests the intuition that they are in some sense inevitable—that one should view them as results that force themselves upon us, so to speak. We argue that, despite the intuitive force of such an “inevitabilist insight,” one should reject it. More specifically, we distinguish two facets of the insight and…
  •
    The basic problem of a theory of truth approximation is defining when a theory is “close to the truth” about some relevant domain. Existing accounts of truthlikeness or verisimilitude address this problem, but are usually limited to the problem of approaching a “deterministic” truth by means of deterministic theories. A general theory of truth approximation, however, should arguably cover also cases where either the relevant theories, or “the truth”, or both, are “probabilistic” in nature. As a…
  •
    Ethical and cognitive challenges in the COVID-19 emergency
    with Chiara Lucifora
    Rivista Internazionale di Filosofia e Psicologia 11 (3): 327-340. 2020.
    The global emergency caused by the spread of COVID-19 raises critical challenges for individuals and communities on many different levels. In particular, politicians, scientists, physicians, and other professionals may face new ethical dilemmas and cognitive constraints as they make critical decisions in extraordinary circumstances. Philosophers and cognitive scientists have long analyzed and discussed such issues. An example is the debate on moral decision making in imaginary scenarios, such as…
  •
    According to the so-called Lockean thesis, a rational agent believes a proposition just in case its probability is sufficiently high, i.e., greater than some suitably fixed threshold. The Preface paradox is usually taken to show that the Lockean thesis is untenable, if one also assumes that rational agents should believe the conjunction of their own beliefs: high probability and rational belief are in a sense incompatible. In this paper, we show that this is not the case in general. More precisely…
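    In symbols, the Lockean thesis says that $Bel(p)$ iff $\Pr(p) > t$ for some threshold $t \in (\tfrac{1}{2}, 1)$ (a standard rendering, not specific to this paper). The Preface-style tension arises because high probability is not preserved under conjunction: for independent claims each with $\Pr(p_i) = 0.99$, already $\Pr(p_1 \wedge \dots \wedge p_{100}) = 0.99^{100} \approx 0.37$, below any plausible threshold even though every conjunct clears it.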
  •
    A partial consequence account of truthlikeness
    Synthese 197 (4): 1627-1646. 2020.
    Popper’s original definition of truthlikeness relied on a central insight: that truthlikeness combines truth and information, in the sense that a proposition is closer to the truth the more true consequences and the fewer false consequences it entails. As intuitively compelling as this definition may be, it is untenable, as proved long ago; still, one can arguably rely on Popper’s intuition to provide an adequate account of truthlikeness. To this aim, we mobilize some classical work on partial entailment…
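    For reference, the Popperian definition alluded to here can be stated as follows (a standard textbook formulation): writing $A_T$ and $A_F$ for the true and false consequences of theory $A$, theory $B$ is at least as truthlike as $A$ iff $A_T \subseteq B_T$ and $B_F \subseteq A_F$, and strictly more truthlike if one inclusion is proper. The result that this is untenable is the well-known Tichý–Miller theorem (1974): on this definition, no false theory ever comes out strictly more truthlike than any other false theory.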
  •
    Searching for information is critical in many situations. In medicine, for instance, careful choice of a diagnostic test can help narrow down the range of plausible diseases that the patient might have. In a probabilistic framework, test selection is often modeled by assuming that people's goal is to reduce uncertainty about possible states of the world. In cognitive science, psychology, and medical decision making, Shannon entropy is the most prominent and most widely used model to formalize pr…
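    The uncertainty-reduction model referred to here is standardly formalized as expected information gain (a generic textbook formulation, not a claim about this paper's specific models): with Shannon entropy $H(P) = -\sum_i p_i \log_2 p_i$ over the possible states (e.g., candidate diseases), a test $Q$ with outcomes $q$ is scored by $EIG(Q) = H(P) - \sum_q \Pr(q)\, H(P \mid q)$, and the test maximizing this expected reduction in entropy is chosen.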
  •
    Why Adding Truths Is Not Enough: A Reply to Mizrahi on Progress as Approximation to the Truth
    International Studies in the Philosophy of Science 32 (2): 129-135. 2019.
    In a recent paper in this journal, entitled ‘Scientific Progress: Why Getting Closer to Truth is Not Enough’ (2017), Moti Mizrahi argues that the view of progress as approximation to the truth or increasing verisimilitude is plainly false. The key premise of his argument is that on such a view of progress, in order to get closer to the truth one only needs to arbitrarily add a true disjunct to a hypothesis or theory. Since quite clearly scientific progress is not a matter of adding true disjuncts…
  •
    We explore the grammar of Bayesian confirmation by focusing on some likelihood principles, including the Weak Law of Likelihood. We show that none of the likelihood principles proposed so far is satisfied by all incremental measures of confirmation, and we argue that some of these measures indeed obey new, prima facie strange, antilikelihood principles. To prove this, we introduce a new measure that violates the Weak Law of Likelihood while satisfying a strong antilikelihood condition. We conclude…
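    For orientation, one common formulation of the Weak Law of Likelihood as a confirmation principle (formulations vary; this one is offered only as an illustration): if $\Pr(E \mid H_1) > \Pr(E \mid H_2)$ and $\Pr(E \mid \neg H_1) \le \Pr(E \mid \neg H_2)$, then $c(H_1, E) \ge c(H_2, E)$, where $c$ is an incremental measure of confirmation.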
  •
    Approaching Truth in Conceptual Spaces
    Erkenntnis 85 (6): 1485-1500. 2020.
    Knowledge representation is a central issue in a number of areas, but few attempts are usually made to bridge different approaches across different fields. As a contribution in this direction, in this paper I focus on one such approach, the theory of conceptual spaces developed within cognitive science, and explore its potential applications in the fields of philosophy of science and formal epistemology. My case-study is provided by the theory of truthlikeness, construed as closeness to “the whole truth”…
  •
    In this paper we provide a compact presentation of the verisimilitudinarian approach to scientific progress (VS, for short) and defend it against the sustained attack recently mounted by Alexander Bird (2007). Advocated by such authors as Ilkka Niiniluoto and Theo Kuipers, VS is the view that progress can be explained in terms of the increasing verisimilitude (or, equivalently, truthlikeness, or approximation to the truth) of scientific theories. According to Bird, VS overlooks the central issue…
  •
    After the demise of logical empiricism in the late 1950s, philosophy of science entered a sort of Kuhnian revolutionary phase. Both its central problems and the methods used to address them underwent a profound change; under the pressure of the “new” philosophy of science—and of the various historical, sociological, cultural, or feminist approaches—the way of doing philosophy championed by Carnap and Popper was progressively abandoned by many scholars interested in the study…
  •
    Truth may not explain predictive success, but truthlikeness does
    Studies in History and Philosophy of Science Part A 44 (4): 590-593. 2013.
    In a recent paper entitled “Truth does not explain predictive success”, Carsten Held argues that the so-called “No-Miracles Argument” for scientific realism is easily refuted when the consequences of the underdetermination of theories by the evidence are taken into account. We contend that the No-Miracles Argument, when it is deployed within the context of sophisticated versions of realism, based on the notion of truthlikeness, survives Held’s criticism unscathed.
  •
    Giochi, dilemmi sociali e scelte collettive [Games, Social Dilemmas, and Collective Choices]
    In Anthony de Jasay (ed.), Scelta, Contratto, Consenso, Rubbettino/Leonardo Facco. pp. 13-56. 2008.
    This is the introductory essay to the Italian translation of Anthony de Jasay's "Choice, Contract, Consent: A Restatement of Liberalism".
  •
    Probability, Approximate Truth, and Truthlikeness: More Ways out of the Preface Paradox
    Australasian Journal of Philosophy 95 (2): 209-225. 2017.
    The so-called Preface Paradox seems to show that one can rationally believe two logically incompatible propositions. We address this puzzle, relying on the notions of truthlikeness and approximate truth as studied within the post-Popperian research programme on verisimilitude. In particular, we show that adequately combining probability, approximate truth, and truthlikeness leads to an explanation of how rational belief is possible in the face of the Preface Paradox. We argue that our account is…
  •
    Starting in the 1960s, theory change became a main concern of philosophy of science. Two of the best-known formal accounts of theory change are the post-Popperian theories of verisimilitude (PPV for short) and the AGM theory of belief change (AGM for short). In this paper, we will investigate the conceptual relations between PPV and AGM and, in particular, we will ask whether the AGM rules for theory change are effective means for approaching the truth, i.e., for achieving…
  •
    Strongly semantic information and verisimilitude
    Ethics and Politics (2): 159-179. 2011.
    In The Philosophy of Information, Luciano Floridi presents a theory of “strongly semantic information”, based on the idea that “information encapsulates truth” (the so-called “veridicality thesis”). Starting with Popper, philosophers of science have developed different explications of the notion of verisimilitude or truthlikeness, construed as a combination of truth and information. Thus, the theory of strongly semantic information and the theory of verisimilitude are intimately tied. Yet, with…
  •
    Guest Editor's Preface
    Etica e Politica 15 (2): 7-13. 2013.
    Preface to a special section on "Cooperation in nature, science, and society".