•  71
    Merging Judgments and the Problem of Truth-Tracking
    In Jerome Lang & Ulle Endriss (eds.), Computational Social Choice 2006, University of Amsterdam. 2006.
    The problem of the aggregation of consistent individual judgments on logically interconnected propositions into a collective judgment on the same propositions has recently drawn much attention. The difficulty lies in the fact that a seemingly reasonable aggregation procedure, such as propositionwise majority voting, cannot ensure an equally consistent collective outcome. The literature on judgment aggregation refers to such dilemmas as the discursive paradox. So far, three procedures have been pr…
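    The inconsistency that propositionwise majority voting can produce is easy to exhibit concretely. The following sketch uses a hypothetical three-judge profile (an illustration, not an example taken from the paper): each judge holds a consistent view on p, q, and their conjunction, yet the propositionwise majorities are jointly inconsistent.

```python
# Discursive dilemma: propositionwise majority voting over individually
# consistent judgment sets can yield an inconsistent collective set.
# The judge profile below is hypothetical, chosen to trigger the paradox.

judges = [
    {"p": True,  "q": True,  "p&q": True},   # consistent: accepts p, q, p&q
    {"p": True,  "q": False, "p&q": False},  # consistent: rejects q, so rejects p&q
    {"p": False, "q": True,  "p&q": False},  # consistent: rejects p, so rejects p&q
]

def majority(prop):
    """True iff a strict majority of judges accepts the proposition."""
    yes = sum(j[prop] for j in judges)
    return yes > len(judges) / 2

collective = {prop: majority(prop) for prop in ("p", "q", "p&q")}
print(collective)  # {'p': True, 'q': True, 'p&q': False}

# The collective accepts p and accepts q, yet rejects their conjunction:
consistent = collective["p&q"] == (collective["p"] and collective["q"])
print(consistent)  # False
```

Each row of the profile is internally consistent; only the propositionwise aggregate violates the logical interconnection between the premises and the conclusion.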
  •  464
    Normativität und Bayesianismus
    In Bernward Gesang (ed.), Deskriptive oder normative Wissenschaftstheorie, Ontos-verlag. pp. 177-204. 2004.
    The topic of this volume is the question of whether philosophy of science is a normative discipline. At first the question is surprising, since for many philosophers of science the answer is a clear "yes": they regard it as a commonplace that philosophy of science is a normative enterprise. On closer inspection, however, it turns out that the question admits of different interpretations, which need to be discussed one by one. This is done in the first section. In the second…
  •  67
    Models and Simulations
    Synthese 169 (3). 2009.
    Special issue. With contributions by Anouk Barberouse, Sarah Francescelli and Cyrille Imbert, Robert Batterman, Roman Frigg and Julian Reiss, Axel Gelfert, Till Grüne-Yanoff, Paul Humphreys, James Mattingly and Walter Warwick, Matthew Parker, Wendy Parker, Dirk Schlimm, and Eric Winsberg.
  •  19
    Kopenhagen contra Bohm – eine Herausforderung für den Realismus?
    with Rainer Müller
    Praxis der Naturwissenschaften - Physik 4: 12-17. 1999.
    The eminent American logician and philosopher W.V.O. Quine placed the following question at the centre of his work: "How do we get from our sense data to theories about the world?" In answering this question, a fundamental problem arises, connected with the fact that only a finite amount of information about the world is ever accessible to us. Every experiment, for example, yields only a finite number of data points.
  •  699
    Bayesian Epistemology
    In Jonathan Dancy (ed.), A Companion to Epistemology, Blackwell. 2010.
    Bayesianism is our leading theory of uncertainty. Epistemology is defined as the theory of knowledge. So “Bayesian Epistemology” may sound like an oxymoron. Bayesianism, after all, studies the properties and dynamics of degrees of belief, understood to be probabilities. Traditional epistemology, on the other hand, places the singularly non-probabilistic notion of knowledge at centre stage, and to the extent that it traffics in belief, that notion does not come in degrees. So how can there be a B…
  •  1
    Proceedings of EPSA09 (edited book)
    Springer. 2012.
    This is a collection of high-quality research papers in the philosophy of science, deriving from papers presented at the second meeting of the European Philosophy of Science Association in Amsterdam, October 2009.
  •  28
    Explanation, Prediction, and Confirmation (edited book)
    with Marcel Weber, Wenceslao Gonzalez, Dennis Dieks, and Thomas Uebel
    Springer. 2011.
    This volume, the second in the Springer series Philosophy of Science in a European Perspective, contains selected papers from the workshops organised by the ESF Research Networking Programme PSE (The Philosophy of Science in a European Perspective) in 2009. Five general topics are addressed: 1. Formal Methods in the Philosophy of Science; 2. Philosophy of the Natural and Life Sciences; 3. Philosophy of the Cultural and Social Sciences; 4. Philosophy of the Physical Sciences; 5. History of t…
  •  124
    The Weight of Competence under a Realistic Loss Function
    Logic Journal of the IGPL 18 (2): 346-352. 2010.
    In many scientific, economic and policy-related problems, pieces of information from different sources have to be aggregated. Typically, the sources are not equally competent. This raises the question of how the relative weights and competences should be related to arrive at an optimal final verdict. Our paper addresses this question under a more realistic perspective of measuring the practical loss implied by an inaccurate verdict.
  •  46
    A federal assembly consists of a number of representatives for each of the nations (states, Länder, cantons,...) that make up the federation. How many representatives should each nation receive? What makes this issue worth quibbling about is that the model of representation that is instituted will have an impact on the welfare distribution over the nations in the federation that will ensue in due course. We will investigate what models of representation yield welfare distributions that score h…
  •  32
    BonJour (1985: 101 and 1999: 124) and other coherence theorists of justification before him (e.g. Ewing, 1934: 246) have complained that we do not have a satisfactory analysis of the notion of coherence. The problem with existing accounts of coherence is that they try to bring precision to our intuitive notion of coherence independently of the particular role that it is meant to play within the coherence theory of justification (e.g. Lewis, 1946: 338). This is a mistake: it does not make any …
  •  66
    Transdisziplinarität – eine Herausforderung für die Wissenschaftstheorie
    In Gereon Wolters & Martin Carrier (eds.), Homo Sapiens und Homo Faber, De Gruyter. pp. 335--343. 2005.
    Contemporary philosophy of science suffers from problems similar to those of the sciences it studies. In philosophy of science, too, specialization is rapidly increasing, and many of the questions under discussion concern nothing but problems of detail that have grown out of a debate that has taken on a life of its own, in which the connection to the original question, and the larger philosophical perspective, is easily lost from view.
  •  170
    We construct a probabilistic coherence measure for information sets which determines a partial coherence ordering. This measure is applied in constructing a criterion for expanding our beliefs in the face of new information. A number of idealizations are made, which can be relaxed by an appeal to Bayesian Networks.
  •  146
    On Correspondence
    Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 33 (1): 79-94. 2002.
    This paper is an essay review of Steven French and Harmke Kamminga (eds.), Correspondence, Invariance and Heuristics. Essays in Honour of Heinz Post (Dordrecht: Kluwer, 1993). I distinguish a variety of correspondence relations between scientific theories (exemplified by cases from the book under review) and examine how one can make sense of the prevailing continuity in scientific theorizing.
  •  137
    Welfarism and the Assessment of Social Decision Rules
    In Jerome Lang & Ulle Endriss (eds.), Computational Social Choice 2006, University of Amsterdam. 2006.
    The choice of a social decision rule for a federal assembly affects the welfare distribution within the federation. But which decision rules can be recommended on welfarist grounds? In this paper, we focus on two welfarist desiderata, viz. (i) maximizing the expected utility of the whole federation and (ii) equalizing the expected utilities of people from different states in the federation. We consider the European Union as an example, set up a probabilistic model of decision making and explore h…
  •  50
    Models, Simulations, and the Reduction of Complexity (edited book)
    with Ulrich Gähde and Jörn Henning Wolf
    De Gruyter. 2013.
    Modern science is, to a large extent, a model-building activity. But how are models constructed? How are they related to theories and data? How do they explain complex scientific phenomena, and what role do computer simulations play here? These questions have kept philosophers of science busy for many years, and much work has been done to identify modeling as the central activity of theoretical science. At the same time, these questions have been addressed by methodologically-minded scientists, …
  •  121
    Mechanisms, Coherence, and Theory Choice in the Cognitive Neurosciences
    In Peter McLaughlin, Peter Machamer & Rick Grush (eds.), Theory and Method in the Neurosciences, Pittsburgh University Press. pp. 70-80. 2001.
    Let me first state that I like Antti Revonsuo’s discussion of the various methodological and interpretational problems in neuroscience. It shows how carefully and with how much methodological reflection scientists have to proceed in this fascinating field of research. I have nothing to add here. Furthermore, I am very sympathetic towards Revonsuo’s general proposal to call for a Philosophy of Neuroscience that stresses foundational issues, but also focuses on methodological and explanatory strategies. I…
  •  174
    Understanding (With) Toy Models
    with Alexander Reutlinger and Dominik Hangleiter
    British Journal for the Philosophy of Science. 2016.
    Toy models are highly idealized and extremely simple models. Although they are omnipresent across scientific disciplines, toy models are a surprisingly under-appreciated subject in the philosophy of science. The main philosophical puzzle regarding toy models is that it is an unsettled question what the epistemic goal of toy modeling is. One promising proposal for answering this question is the claim that the epistemic goal of toy models is to provide individual scientists with understanding. The…
  •  166
    Review of Inference to the Best Explanation
    with Lefteris Farmakis
    Notre Dame Philosophical Reviews 1 (6). 2005.
    The first edition of Peter Lipton's Inference to the Best Explanation, which appeared in 1991, is a modern classic in the philosophy of science. Yet in the second edition of the book, Lipton proves that even a classic can be improved. Not only does Lipton elaborate and expand on the themes covered in the first edition, but he also adds a new chapter on Bayesianism. In particular, he attempts a reconciliation between the Bayesian approach and that offered by Inference to the Best Explanation (IBE…
  •  323
    The aggregation of consistent individual judgments on logically interconnected propositions into a collective judgment on those propositions has recently drawn much attention. Seemingly reasonable aggregation procedures, such as propositionwise majority voting, cannot ensure an equally consistent collective conclusion. The literature on judgment aggregation refers to that problem as the discursive dilemma. In this paper, we motivate that many groups do not only want to reach a factually right co…
  •  55
    We provide welfarist evaluations of decision rules for federations of states and consider models, under which the interests of people from different states are stochastically dependent. We concentrate on two welfarist standards; they require that the expected utility for the federation be maximized or that the expected utilities for people from different states be equal. We discuss an analytic result that characterizes the decision rule with maximum expected utility, set up a class of models tha…
  •  188
    Models, Mechanisms, and Coherence
    with Matteo Colombo and Robert van Iersel
    British Journal for the Philosophy of Science 66 (1): 181-212. 2015.
    Life-science phenomena are often explained by specifying the mechanisms that bring them about. The new mechanistic philosophers have done much to substantiate this claim and to provide us with a better understanding of what mechanisms are and how they explain. Although there is disagreement among current mechanists on various issues, they share a common core position and a seeming commitment to some form of scientific realism. But is such a commitment necessary? Is it the best way to go about me…
  •  13
    Editorial
    Logic Journal of the IGPL 18 (2): 277-277. 2010.
    Social epistemology is a relatively new and booming field of research. It studies the social dimension of the pursuit of acquiring true beliefs and requires philosophical as well as sociological and economic expertise. The insights gained in social epistemology are not only of theoretical interest; they also improve our understanding of social and political processes, as the field includes the analysis of group deliberation and group decision-making. However, surprisingly little work has so far …
  •  35
    Review: The Sun, the Genome and the Internet by F. Dyson (review)
    Physikalische Blätter 56. 2000.
  •  203
    In recent years, the European Union (EU) has repeatedly tried to reform its institutions. When the draft of a European Constitution and, later, the Treaty of Lisbon were negotiated, one of the most hotly debated points of contention was the question of which decision rule the EU Council of Ministers should vote by. This is a genuinely normative question. Political philosophers and ethicists should therefore be able to contribute something to it as well. In the follow…
  •  50
    In everyday life, as well as in science, we have to deal with and act on the basis of partial (i.e. incomplete, uncertain, or even inconsistent) information. This observation is the source of a broad research activity from which a number of competing approaches have arisen. There is some disagreement concerning the way in which partial or full ignorance is and should be handled. The most successful approaches include both quantitative aspects (by means of probability theory) and qualitative aspe…
  •  1
    Book Review: Luc Bovens and Stephan Hartmann "Bayesian Epistemology" (review)
    Studia Logica 81 (2): 289-292. 2005.
    Book Review of Luc Bovens and Stephan Hartmann *Bayesian Epistemology* by Erik J. Olsson
  •  13
    One of the major problems that artificial intelligence needs to tackle is the combination of different and potentially conflicting sources of information. Examples are multi-sensor fusion, database integration and expert systems development. In this paper we are interested in the aggregation of propositional logic-based information, a problem recently addressed in the literature on information fusion. It has applications in multi-agent systems that aim at aggregating the distributed agent-based …
  •  152
    Combining testimonial reports from independent and partially reliable information sources is an important epistemological problem of uncertain reasoning. Within the framework of Dempster–Shafer theory, we propose a general model of partially reliable sources, which includes several previously known results as special cases. The paper reproduces these results on the basis of a comprehensive model taxonomy. This gives a number of new insights and thereby contributes to a better understanding of th…
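    The Dempster–Shafer machinery behind this kind of model can be sketched in a few lines. The example below is an illustration only, not the paper's own model taxonomy: two independent, partially reliable witnesses each report hypothesis H, modeled as "simple support" mass functions (mass r on the report, mass 1-r on the whole frame, i.e. ignorance), and Dempster's rule of combination pools them.

```python
# Minimal Dempster's rule of combination over a two-element frame
# {H, not-H}. The witness model (reliability r -> simple support
# function) is a standard textbook illustration, not the paper's model.
from itertools import product

def combine(m1, m2):
    """Dempster's rule: pool two mass functions keyed by frozenset focal elements."""
    unnormalized = {}
    conflict = 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:  # compatible focal elements: mass goes to the intersection
            unnormalized[inter] = unnormalized.get(inter, 0.0) + x * y
        else:      # incompatible: mass is conflict, removed by normalization
            conflict += x * y
    k = 1.0 - conflict
    return {s: v / k for s, v in unnormalized.items()}

FRAME = frozenset({"H", "notH"})
H = frozenset({"H"})

def witness(r):
    """Simple support for H: mass r on H, mass 1-r on total ignorance."""
    return {H: r, FRAME: 1.0 - r}

m = combine(witness(0.6), witness(0.7))
print(m[H])  # ≈ 0.88, i.e. 1 - (1-0.6)*(1-0.7)
```

Because both witnesses support the same hypothesis, there is no conflict and the combined support 1 - (1-r1)(1-r2) grows with each additional report, which is the basic mechanism behind pooling partially reliable testimony in this framework.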
  •  316
    The Chromodielectric Soliton Model: Quark Self-Energy and Hadron Bags
    with Larry Wilets and Ping Tang
    Physical Review C 55: 2067-2077. 1997.
    The chromodielectric soliton model is Lorentz and chirally invariant. It has been demonstrated to exhibit dynamical chiral symmetry breaking and spatial confinement in the locally uniform approximation. We here study the full nonlocal quark self-energy in a color-dielectric medium modeled by a two-parameter Fermi function. Here color confinement is manifest. The self-energy thus obtained is used to calculate quark wave functions in the medium which, in turn, are used to calculate the nucleon and…