Gustavo Cevolani

IMT School for Advanced Studies Lucca
  •
    Giochi, dilemmi sociali e scelte collettive [Games, social dilemmas, and collective choices]
    In Anthony de Jasay (ed.), Scelta, Contratto, Consenso, Rubbettino/Leonardo Facco. pp. 13-56. 2008.
    This is the introductory essay to the Italian translation of Anthony de Jasay's "Choice, Contract, Consent: A Restatement of Liberalism".
  •
    Starting from the sixties of the past century, theory change has become a main concern of philosophy of science. Two of the best-known formal accounts of theory change are the post-Popperian theories of verisimilitude (PPV for short) and the AGM theory of belief change (AGM for short). In this paper, we will investigate the conceptual relations between PPV and AGM and, in particular, we will ask whether the AGM rules for theory change are effective means for approaching the truth, i.e., for achi…
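    For orientation, the core AGM operations at issue can be stated in standard notation (a textbook formulation, not taken from the paper itself):

      % K is a belief set (a logically closed set of sentences); Cn is
      % classical logical consequence.
      % Expansion: add the new sentence and close under consequence.
      \[ K + \varphi \;=\; \mathrm{Cn}(K \cup \{\varphi\}) \]
      % Revision via the Levi identity: contract by the negation of the
      % input, then expand by the input.
      \[ K \ast \varphi \;=\; (K \div \lnot\varphi) + \varphi \]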
  •
    Hayek in the lab. Austrian School, game theory, and experimental economics
    Logic and Philosophy of Science 9 (1): 429-436. 2011.
    Focusing on the work of Friedrich von Hayek and Vernon Smith, we discuss some conceptual links between Austrian economics and recent work in behavioral game theory and experimental economics. After a brief survey of the main methodological aspects of Austrian and experimental economics, we suggest that common views on subjectivism, individualism, and the role of qualitative explanations and predictions in social science may favour a fruitful interaction between these two research programs.
  •
    Strongly semantic information and verisimilitude
    Ethics and Politics (2): 159-179. 2011.
    In The Philosophy of Information, Luciano Floridi presents a theory of “strongly semantic information”, based on the idea that “information encapsulates truth” (the so-called “veridicality thesis”). Starting with Popper, philosophers of science have developed different explications of the notion of verisimilitude or truthlikeness, construed as a combination of truth and information. Thus, the theory of strongly semantic information and the theory of verisimilitude are intimately tied. Yet, with…
  •
    According to the so-called Lockean thesis, a rational agent believes a proposition just in case its probability is sufficiently high, i.e., greater than some suitably fixed threshold. The Preface paradox is usually taken to show that the Lockean thesis is untenable, if one also assumes that rational agents should believe the conjunction of their own beliefs: high probability and rational belief are in a sense incompatible. In this paper, we show that this is not the case in general. More precise…
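    As a point of reference, the Lockean thesis mentioned here is standardly formalized as follows (a textbook rendering, not quoted from the paper):

      % Belief is probability above a fixed threshold t.
      \[ \mathrm{Bel}(p) \iff \Pr(p) \geq t, \qquad t \in \left(\tfrac{1}{2}, 1\right] \]
      % The tension behind the Preface paradox: each conjunct can clear
      % the threshold while their conjunction does not, since in general
      \[ \Pr(p_1 \wedge \dots \wedge p_n) \;\leq\; \min_i \Pr(p_i). \]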
  •
    In this paper we provide a compact presentation of the verisimilitudinarian approach to scientific progress (VS, for short) and defend it against the sustained attack recently mounted by Alexander Bird (2007). Advocated by such authors as Ilkka Niiniluoto and Theo Kuipers, VS is the view that progress can be explained in terms of the increasing verisimilitude (or, equivalently, truthlikeness, or approximation to the truth) of scientific theories. According to Bird, VS overlooks the central issue…
  •
    Truth approximation via abductive belief change
    Logic Journal of the IGPL 21 (6): 999-1016. 2013.
    We investigate the logical and conceptual connections between abductive reasoning construed as a process of belief change, on the one hand, and truth approximation, construed as increasing (estimated) verisimilitude, on the other. We introduce the notion of ‘(verisimilitude-guided) abductive belief change’ and discuss under what conditions abductively changing our theories or beliefs does lead them closer to the truth, and hence tracks truth approximation conceived as the main aim of inquiry…
  •
    Theory change is a central concern in contemporary epistemology and philosophy of science. In this paper, we investigate the relationships between two ongoing research programs providing formal treatments of theory change: the (post-Popperian) approach to verisimilitude and the AGM theory of belief change. We show that appropriately construed accounts emerging from those two lines of epistemological research do yield convergences relative to a specified kind of theories, here labeled “conjunctiv…
  •
    In this paper, we address the problem of truth approximation through theory change, asking whether revising our theories by newly acquired data leads us closer to the truth about a given domain. More particularly, we focus on “nomic conjunctive theories”, i.e., theories expressed as conjunctions of logically independent statements concerning the physical or, more generally, nomic possibilities and impossibilities of the domain under inquiry. We define both a comparative and a quantitative notion…
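    To illustrate the kind of quantitative notion at stake, here is one simple content-based measure for conjunctive theories (an illustrative sketch, not necessarily the paper's own definition):

      % T is a conjunction of basic claims in a language with n such claims;
      % t(T) and f(T) count its true and false conjuncts, respectively.
      \[ \mathrm{Vs}(T) \;=\; \frac{t(T) - f(T)}{n} \]
      % Vs(T) ranges from -1 (all n basic claims asserted and false) to 1
      % (all n asserted and true): adding true conjuncts raises the score,
      % adding false ones lowers it.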
  •
    Review of Olsson, Erik J. and Enqvist, Sebastian, Belief Revision Meets Philosophy of Science
  •
    Probability, Approximate Truth, and Truthlikeness: More Ways out of the Preface Paradox
    Australasian Journal of Philosophy 95 (2): 209-225. 2017.
    The so-called Preface Paradox seems to show that one can rationally believe two logically incompatible propositions. We address this puzzle, relying on the notions of truthlikeness and approximate truth as studied within the post-Popperian research programme on verisimilitude. In particular, we show that adequately combining probability, approximate truth, and truthlikeness leads to an explanation of how rational belief is possible in the face of the Preface Paradox. We argue that our account is…
  •
    Fallibilism, Verisimilitude, and the Preface Paradox
    Erkenntnis 82 (1): 169-183. 2017.
    The Preface Paradox apparently shows that it is sometimes rational to believe logically incompatible propositions. In this paper, I propose a way out of the paradox based on the ideas of fallibilism and verisimilitude. More precisely, I defend the view that a rational inquirer can fallibly believe or accept a proposition which is false, or likely false, but verisimilar; and I argue that this view makes the Preface Paradox disappear. Some possible objections to my proposal, and an alternative vie…
  •
    Truth may not explain predictive success, but truthlikeness does
    Studies in History and Philosophy of Science Part A 44 (4): 590-593. 2013.
    In a recent paper entitled “Truth does not explain predictive success”, Carsten Held argues that the so-called “No-Miracles Argument” for scientific realism is easily refuted when the consequences of the underdetermination of theories by the evidence are taken into account. We contend that the No-Miracles Argument, when it is deployed within the context of sophisticated versions of realism, based on the notion of truthlikeness, survives Held’s criticism unscathed.
  •
    Truth approximation, belief merging, and peer disagreement
    Synthese 191 (11): 2383-2401. 2014.
    In this paper, we investigate the problem of truth approximation via belief merging, i.e., we ask whether, and under what conditions, a group of inquirers merging together their beliefs makes progress toward the truth about the underlying domain. We answer this question by proving some formal results on how belief merging operators perform with respect to the task of truth approximation, construed as increasing verisimilitude or truthlikeness. Our results shed new light on the issue of how ratio…
  •
    Defending De-idealization in Economic Modeling: A Case Study
    Philosophy of the Social Sciences 52 (1-2): 25-52. 2022.
    This paper defends the viability of de-idealization strategies in economic modeling against recent criticism. De-idealization occurs when an idealized assumption of a theoretical model is replaced with a more realistic one. Recently, some scholars have raised objections against the possibility or fruitfulness of de-idealizing economic models, suggesting that economists do not employ this kind of strategy. We present a detailed case study from the theory of industrial organization, discussing thr…
  •
    Searching for information is critical in many situations. In medicine, for instance, careful choice of a diagnostic test can help narrow down the range of plausible diseases that the patient might have. In a probabilistic framework, test selection is often modeled by assuming that people's goal is to reduce uncertainty about possible states of the world. In cognitive science, psychology, and medical decision making, Shannon entropy is the most prominent and most widely used model to formalize pr…
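    A minimal sketch of the entropy-based model of test selection described here, with hypothetical numbers and function names (not taken from the paper): the preferred test is the one with the greatest expected reduction of Shannon entropy over the candidate diseases.

      import math

      def entropy(dist):
          """Shannon entropy (in bits) of a probability distribution."""
          return -sum(p * math.log2(p) for p in dist if p > 0)

      def expected_entropy_reduction(prior, likelihoods):
          """Expected information gain of a binary diagnostic test.

          prior: P(disease_i) for each candidate disease i.
          likelihoods: P(positive result | disease_i) for each i.
          """
          p_pos = sum(p * l for p, l in zip(prior, likelihoods))
          p_neg = 1 - p_pos
          # Posterior over diseases under each test outcome (Bayes' theorem).
          post_pos = [p * l / p_pos for p, l in zip(prior, likelihoods)]
          post_neg = [p * (1 - l) / p_neg for p, l in zip(prior, likelihoods)]
          expected_posterior = p_pos * entropy(post_pos) + p_neg * entropy(post_neg)
          return entropy(prior) - expected_posterior

      prior = [0.7, 0.3]  # hypothetical priors over two diseases
      print(expected_entropy_reduction(prior, [0.9, 0.1]))  # discriminating test
      print(expected_entropy_reduction(prior, [0.5, 0.5]))  # uninformative test: 0.0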
  •
    Why Adding Truths Is Not Enough: A Reply to Mizrahi on Progress as Approximation to the Truth
    International Studies in the Philosophy of Science 32 (2): 129-135. 2019.
    In a recent paper in this journal, entitled ‘Scientific Progress: Why Getting Closer to Truth is Not Enough’ (2017), Moti Mizrahi argues that the view of progress as approximation to the truth or increasing verisimilitude is plainly false. The key premise of his argument is that on such a view of progress, in order to get closer to the truth one only needs to arbitrarily add a true disjunct to a hypothesis or theory. Since quite clearly scientific progress is not a matter of adding true disjunct…
  •
    Simple Models in Complex Worlds: Occam’s Razor and Statistical Learning Theory
    with Falco J. Bargagli Stoffi and Giorgio Gnecco
    Minds and Machines 32 (1): 13-42. 2022.
    The idea that “simplicity is a sign of truth”, and the related “Occam’s razor” principle, stating that, all other things being equal, simpler models should be preferred to more complex ones, have been long discussed in philosophy and science. We explore these ideas in the context of supervised machine learning, namely the branch of artificial intelligence that studies algorithms which balance simplicity and accuracy in order to effectively learn about the features of the underlying domain. Focus…
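    A toy illustration of the simplicity-accuracy balance discussed here (synthetic data, not from the paper): increasingly complex polynomial models fit the training sample better and better while eventually generalizing worse.

      import numpy as np

      rng = np.random.default_rng(0)
      x = np.linspace(0, 1, 30)
      y = np.sin(2 * np.pi * x) + rng.normal(0, 0.3, x.size)  # noisy observations
      x_new = np.linspace(0, 1, 200)
      y_new = np.sin(2 * np.pi * x_new)  # the underlying "truth"

      for degree in (1, 3, 9):
          coeffs = np.polyfit(x, y, degree)  # fit a degree-d polynomial
          train_mse = np.mean((np.polyval(coeffs, x) - y) ** 2)
          test_mse = np.mean((np.polyval(coeffs, x_new) - y_new) ** 2)
          print(f"degree {degree}: train MSE {train_mse:.3f}, test MSE {test_mse:.3f}")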
  •
    This paper contributes to the debate on the question of whether a systematic connection obtains between one’s commitment to realism or antirealism and one’s attitude towards the possibility of radical theoretical novelty, namely, theory change affecting our best, most successful theories (see, e.g., Stanford in Synthese 196:3915–3932, 2019; Dellsén in Stud Hist Philos Sci 76:30–38, 2019). We argue that it is not allegiance to realism or antirealism as such that primarily dictates one’s response…
  •
    We explore the grammar of Bayesian confirmation by focusing on some likelihood principles, including the Weak Law of Likelihood. We show that none of the likelihood principles proposed so far is satisfied by all incremental measures of confirmation, and we argue that some of these measures indeed obey new, prima facie strange, antilikelihood principles. To prove this, we introduce a new measure that violates the Weak Law of Likelihood while satisfying a strong antilikelihood condition. We conclu…
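    For orientation, one common rendering of the Weak Law of Likelihood for an incremental confirmation measure c is the following (a standard formulation from the confirmation-theory literature; the paper's own new measure is not reproduced here):

      % If e is more expected under h1 than under h2, and no more expected
      % under not-h1 than under not-h2, then e confirms h1 more than h2:
      \[ \Pr(e \mid h_1) > \Pr(e \mid h_2) \ \text{and}\ \Pr(e \mid \lnot h_1) \le \Pr(e \mid \lnot h_2) \ \Rightarrow\ c(h_1, e) > c(h_2, e) \]
      % A familiar example of an incremental measure is the difference measure
      \[ d(h, e) \;=\; \Pr(h \mid e) - \Pr(h), \]
      % which is positive exactly when e raises the probability of h.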
  •
    Reverse inference is a crucial inferential strategy used in cognitive neuroscience to derive conclusions about the engagement of cognitive processes from patterns of brain activation. While widely employed in experimental studies, it is now viewed with increasing scepticism within the neuroscience community. One problem with reverse inference is that it is logically invalid, being an instance of abduction in Peirce’s sense. In this paper, we offer the first systematic analysis of reverse inferen…
  •
    After the demise of logical empiricism in the late fifties of the past century, philosophy of science entered a sort of Kuhnian revolutionary phase. Both its central problems and the methods used to address them underwent a profound change; under the pressure of the “new” philosophy of science—and of the various historical, sociological, cultural, or feminist approaches—the way of doing philosophy championed by Carnap and Popper was progressively abandoned by many scholars interested in the stud…
  •
    Approaching Truth in Conceptual Spaces
    Erkenntnis 85 (6): 1485-1500. 2020.
    Knowledge representation is a central issue in a number of areas, but few attempts are usually made to bridge different approaches across different fields. As a contribution in this direction, in this paper I focus on one such approach, the theory of conceptual spaces developed within cognitive science, and explore its potential applications in the fields of philosophy of science and formal epistemology. My case-study is provided by the theory of truthlikeness, construed as closeness to “the wh…
  •
    A partial consequence account of truthlikeness
    Synthese 197 (4): 1627-1646. 2020.
    Popper’s original definition of truthlikeness relied on a central insight: that truthlikeness combines truth and information, in the sense that a proposition is closer to the truth the more true consequences and the less false consequences it entails. As intuitively compelling as this definition may be, it is untenable, as proved long ago; still, one can arguably rely on Popper’s intuition to provide an adequate account of truthlikeness. To this aim, we mobilize some classical work on partial en…
  •
    Carnapian truthlikeness
    Logic Journal of the IGPL 24 (4): 542-556. 2016.
    Theories of truthlikeness (or verisimilitude) are currently being classified according to two independent distinctions: that between ‘content’ and ‘likeness’ accounts, and that between ‘conjunctive’ and ‘disjunctive’ ones. In this article, I present and discuss a new definition of truthlikeness, which employs Carnap’s notion of the content elements entailed by a theory or proposition, and is then labelled ‘Carnapian’. After studying in detail the properties and shortcomings of this definition, I…
  •
    After Karl Popper’s original work, several approaches were developed to provide a sound explication of the notion of verisimilitude. With few exceptions, these contributions have assumed that the truth to be approximated is deterministic. This collection of ten papers addresses the more general problem of approaching probabilistic truths. They include attempts to find appropriate measures for the closeness to probabilistic truth and to evaluate claims about such distances on the basis of empiric…
  •
    We examine the relationship between scientific knowledge and the legal system with a focus on the exclusion of expert testimony from trial as ruled by the Daubert standard in the US. We introduce a simple framework to understand and assess the role of judges as “gatekeepers”, monitoring the admission of science in the courtroom. We show how judges face a crucial choice, namely, whether to limit Daubert assessment to the abstract reliability of the methods used by the expert witness or also to che…