•
    Who’s Afraid of Nagelian Reduction?
    Erkenntnis 73 (3): 393-412. 2010.
    We reconsider the Nagelian theory of reduction and argue that, contrary to a widely held view, it is the right analysis of intertheoretic reduction. The alleged difficulties of the theory either vanish upon closer inspection or turn out to be substantive philosophical questions rather than knock-down arguments.
•
    This paper explores various functions of idealizations in quantum field theory. To this end it is important first to distinguish between different kinds of theories and models of, or inspired by, quantum field theory. Idealizations have pragmatic and cognitive functions. Analyzing a case study from hadron physics, I demonstrate the virtues of studying highly idealized models for exploring the features of theories with an extremely rich structure, such as quantum field theory, and for gaining some un…
•
    Elementary particle physics is widely regarded as a subdiscipline of physics that pursues a reductionist programme par excellence. Drawing on an analysis of recent methods of elementary particle physics, this paper attempts to assess whether this claim is justified. The problem of reductionism can be divided into ontological, epistemological, and methodological aspects.
•
    Editorial to “Reduction and the Special Sciences”
    Erkenntnis 73 (3): 293-293. 2010.
    Science presents us with a variety of accounts of the world. While some of these accounts posit deep theoretical structure and fundamental entities, others do not. But which of these approaches is the right one? How should science conceptualize the world? And what is the relation between the various accounts? Opinions on these issues diverge wildly in philosophy of science. At one extreme are reductionists who argue that higher-level theories should, in principle, be incorporated in, or eliminat…
•
    Bose-Einstein-Kondensation ultrakalter Atome
    with Rainer Müller and Hartmut Wiesner
    In W. Schneider (ed.), Wege in der Physikdidaktik, Band IV, Palm & Enke. pp. 165-183. 1998.
    On 14 July 1995, the respected science journal Science and the famous American daily newspaper New York Times (on its front page) simultaneously reported the first experimental creation of a Bose-Einstein condensate from a gas of weakly interacting alkali atoms at the Joint Institute for Laboratory Astrophysics (JILA) in Boulder, Colorado (USA). What was so significant about this achievement that it was announced in this manner?
•
    Models as a Tool for Theory Construction: Some Strategies of Preliminary Physics
    In William Herfel, Władysław Krajewski, Ilkka Niiniluoto & Ryszard Wójcicki (eds.), Theories and Models in Scientific Processes, Rodopi. pp. 49-67. 1995.
    Theoretical models are an important tool for many aspects of scientific activity. They are used, inter alia, to structure data, to apply theories, or even to construct new theories. But what exactly is a model? It turns out that there is no proper definition of the term "model" that covers all these aspects. I therefore restrict myself here to evaluating the function of models in the research process, using "model" in the loose way physicists do. To this end, I distinguish four kinds of models. These ar…
•
    Special issue of Synthese on Bayesian Epistemology
    with Luc Bovens
    Synthese 156 (3): 403-403. 2007.
    The papers in this collection were presented at a workshop on Bayesian Epistemology at the 26th International Wittgenstein Symposium in Kirchberg, Austria (August 4–7, 2003), at a workshop on Philosophy and Probability at the conference GAP5 in Bielefeld, Germany (September 20–22, 2003), at a workshop on Bayesian Epistemology at the Centre for Philosophy of Natural and Social Science, London School of Economics and Political Science in London, UK (June 28, 2004), or at the seminar of the res…
•
    Simulation
    In Jürgen Mittelstrass (ed.), Enzyklopädie Philosophie und Wissenschaftstheorie, Vol. 3, Metzler. 1995.
    Simulation (from Latin simulare; English simulation, French simulation, Italian simulazione): a term for the imitation of one process by another process. Both processes run on some system. The simulated and the simulating system (in cybernetics, the simulator) may be realized on the same or on different substrates.
•
    An Impossibility Result for Coherence Rankings
    with Luc Bovens
    Philosophical Studies 128 (1): 77-91. 2006.
    If we receive information from multiple independent and partially reliable information sources, then whether we are justified in believing these information items is affected by how reliable the sources are, by how well the information coheres with our background beliefs, and by how internally coherent the information is. We consider the following question. Is coherence a separable determinant of our degree of belief, i.e. is it the case that the more coherent the new information is, the more justi…
•
    Artificial Intelligence and Its Methodological Implications
    Vienna Circle Institute Yearbook 11 217-223. 2004.
    Donald Gillies is one of the pioneers in the philosophical analysis of artificial intelligence. In his recent book, Gillies not only makes a new and rapidly developing field of science accessible to philosophers; he also introduces philosophical topics relevant to researchers in AI and thereby helps establish a dialogue between the two disciplines. His book clearly and convincingly demonstrates the fruitful interplay between AI and philosophy of science.
•
    Merging Judgments and the Problem of Truth-Tracking
    In Jerome Lang & Ulle Endriss (eds.), Computational Social Choice 2006, University of Amsterdam. 2006.
    The problem of aggregating consistent individual judgments on logically interconnected propositions into a collective judgment on the same propositions has recently drawn much attention. The difficulty lies in the fact that a seemingly reasonable aggregation procedure, such as propositionwise majority voting, cannot ensure an equally consistent collective outcome. The literature on judgment aggregation refers to such dilemmas as the discursive paradox. So far, three procedures have been pr…
•
    Normativität und Bayesianismus
    In Bernward Gesang (ed.), Deskriptive oder normative Wissenschaftstheorie, Ontos-verlag. pp. 177-204. 2004.
    The topic of this volume is the question of whether philosophy of science is a normative discipline. At first the question seems surprising, since for many philosophers of science the answer is a clear "yes"; they consider it a commonplace that philosophy of science is a normative enterprise. On closer inspection, however, it turns out that the question admits of different interpretations, which must be discussed separately. This is done in the first section. In the second…
•
    Models and Simulations
    Synthese 169 (3). 2009.
    Special issue. With contributions by Anouk Barberousse, Sara Franceschelli and Cyrille Imbert, Robert Batterman, Roman Frigg and Julian Reiss, Axel Gelfert, Till Grüne-Yanoff, Paul Humphreys, James Mattingly and Walter Warwick, Matthew Parker, Wendy Parker, Dirk Schlimm, and Eric Winsberg.
•
    Kopenhagen contra Bohm – eine Herausforderung für den Realismus?
    with Rainer Müller
    Praxis der Naturwissenschaften - Physik 4 12-17. 1999.
    The eminent American logician and philosopher W.V.O. Quine placed the following question at the centre of his work: "How do we get from our sense data to theories about the world?" Answering this question runs into a fundamental problem, connected with the fact that only a finite set of information about the world is ever accessible to us. Every experiment, for example, yields only a finite number of data points.
•
    Bayesian Epistemology
    In Jonathan Dancy (ed.), A Companion to Epistemology, Blackwell. 2010.
    Bayesianism is our leading theory of uncertainty. Epistemology is defined as the theory of knowledge. So “Bayesian Epistemology” may sound like an oxymoron. Bayesianism, after all, studies the properties and dynamics of degrees of belief, understood to be probabilities. Traditional epistemology, on the other hand, places the singularly non-probabilistic notion of knowledge at centre stage, and to the extent that it traffics in belief, that notion does not come in degrees. So how can there be a B…
•
    Proceedings of EPSA09 (edited book)
    Springer. 2012.
    This is a collection of high-quality research papers in the philosophy of science, deriving from papers presented at the second meeting of the European Philosophy of Science Association in Amsterdam, October 2009.
•
    Explanation, Prediction, and Confirmation (edited book)
    with Marcel Weber, Wenceslao Gonzalez, Dennis Dieks, and Thomas Uebel
    Springer. 2011.
    This volume, the second in the Springer series Philosophy of Science in a European Perspective, contains selected papers from the workshops organised by the ESF Research Networking Programme PSE (The Philosophy of Science in a European Perspective) in 2009. Five general topics are addressed: 1. Formal Methods in the Philosophy of Science; 2. Philosophy of the Natural and Life Sciences; 3. Philosophy of the Cultural and Social Sciences; 4. Philosophy of the Physical Sciences; 5. History of t…
•
    The Weight of Competence under a Realistic Loss Function
    Logic Journal of the IGPL 18 (2): 346-352. 2010.
    In many scientific, economic and policy-related problems, pieces of information from different sources have to be aggregated. Typically, the sources are not equally competent. This raises the question of how the relative weights and competences should be related to arrive at an optimal final verdict. Our paper addresses this question under a more realistic perspective of measuring the practical loss implied by an inaccurate verdict.
•
    A federal assembly consists of a number of representatives for each of the nations (states, Länder, cantons, ...) that make up the federation. How many representatives should each nation receive? What makes this issue worth quibbling about is that the model of representation that is instituted will have an impact on the welfare distribution over the nations in the federation that will ensue in due course. We investigate which models of representation yield welfare distributions that score h…
•
    Transdisziplinarität – eine Herausforderung für die Wissenschaftstheorie
    In Gereon Wolters & Martin Carrier (eds.), Homo Sapiens und Homo Faber, De Gruyter. pp. 335--343. 2005.
    Contemporary philosophy of science suffers from problems similar to those of the sciences it studies. Specialization is increasing sharply in philosophy of science as well, and many of the questions under discussion concern only problems of detail that have grown out of a debate that has taken on a life of its own, so that the connection to the original question, and the larger philosophical perspective, is easily lost from view.
•
    We construct a probabilistic coherence measure for information sets which determines a partial coherence ordering. This measure is applied in constructing a criterion for expanding our beliefs in the face of new information. A number of idealizations are made, which can be relaxed by appeal to Bayesian Networks.
•
    BonJour (1985: 101 and 1999: 124) and other coherence theorists of justification before him (e.g. Ewing, 1934: 246) have complained that we do not have a satisfactory analysis of the notion of coherence. The problem with existing accounts of coherence is that they try to bring precision to our intuitive notion of coherence independently of the particular role that it is meant to play within the coherence theory of justification (e.g. Lewis, 1946: 338). This is a mistake: it does not make any …
•
    On Correspondence
    Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 33 (1): 79-94. 2002.
    This paper is an essay review of Steven French and Harmke Kamminga (eds.), Correspondence, Invariance and Heuristics: Essays in Honour of Heinz Post (Dordrecht: Kluwer, 1993). I distinguish a variety of correspondence relations between scientific theories (exemplified by cases from the book under review) and examine how one can make sense of the prevailing continuity in scientific theorizing.
•
    Welfarism and the Assessment of Social Decision Rules
    In Jerome Lang & Ulle Endriss (eds.), Computational Social Choice 2006, University of Amsterdam. 2006.
    The choice of a social decision rule for a federal assembly affects the welfare distribution within the federation. But which decision rules can be recommended on welfarist grounds? In this paper, we focus on two welfarist desiderata, viz. (i) maximizing the expected utility of the whole federation and (ii) equalizing the expected utilities of people from different states in the federation. We consider the European Union as an example, set up a probabilistic model of decision making and explore h…
•
    Models, Simulations, and the Reduction of Complexity (edited book)
    with Ulrich Gähde and Jörn Henning Wolf
    De Gruyter. 2013.
    Modern science is, to a large extent, a model-building activity. But how are models constructed? How are they related to theories and data? How do they explain complex scientific phenomena, and what role do computer simulations play here? These questions have kept philosophers of science busy for many years, and much work has been done to identify modeling as the central activity of theoretical science. At the same time, these questions have been addressed by methodologically minded scientists, …
•
    Mechanisms, Coherence, and Theory Choice in the Cognitive Neurosciences
    In Peter McLaughlin, Peter Machamer & Rick Grush (eds.), Theory and Method in the Neurosciences, Pittsburgh University Press. pp. 70-80. 2001.
    Let me first state that I like Antti Revonsuo’s discussion of the various methodological and interpretational problems in neuroscience. It shows how carefully and with how much methodological reflection scientists have to proceed in this fascinating field of research. I have nothing to add here. Furthermore, I am very sympathetic to Revonsuo’s general proposal to call for a Philosophy of Neuroscience that stresses foundational issues, but also focuses on methodological and explanatory strategies. I…
•
    Understanding (With) Toy Models
    with Alexander Reutlinger and Dominik Hangleiter
    British Journal for the Philosophy of Science. 2016.
    Toy models are highly idealized and extremely simple models. Although they are omnipresent across scientific disciplines, toy models are a surprisingly under-appreciated subject in the philosophy of science. The main philosophical puzzle regarding toy models is that their epistemic goal remains an unsettled question. One promising proposal for answering this question is the claim that the epistemic goal of toy models is to provide individual scientists with understanding. The…
•
    Review of Inference to the Best Explanation
    with Lefteris Farmakis
    Notre Dame Philosophical Reviews 1 (6). 2005.
    The first edition of Peter Lipton's Inference to the Best Explanation, which appeared in 1991, is a modern classic in the philosophy of science. Yet in the second edition of the book, Lipton proves that even a classic can be improved. Not only does Lipton elaborate and expand on the themes covered in the first edition, but he also adds a new chapter on Bayesianism. In particular, he attempts a reconciliation between the Bayesian approach and that offered by Inference to the Best Explanation (IBE…
•
    The aggregation of consistent individual judgments on logically interconnected propositions into a collective judgment on those propositions has recently drawn much attention. Seemingly reasonable aggregation procedures, such as propositionwise majority voting, cannot ensure an equally consistent collective conclusion. The literature on judgment aggregation refers to this problem as the discursive dilemma. In this paper, we argue that many groups want not only to reach a factually right co…