•
    The Confirmation of Common Component Causes
    PSA Proceedings of the Biennial Meeting of the Philosophy of Science Association 1988 (1): 2-9. 1988.
    There is an interesting problem concerning component causes posed by Cartwright (1983) in her book How the Laws of Physics Lie, which is easily explained in terms of a simple example. Consider a cup sitting on the table. Why doesn’t it move? The explanation given by Newtonian mechanics is that the cup is experiencing two forces, the downward force of gravity and the upward ‘elastic’ force of the table, and these two forces exactly cancel to produce a zero resultant force. This zero resultant force…
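    The cancellation described in this abstract can be sketched numerically. This is a minimal illustration, not taken from the paper; the mass value is made up:

```python
# Force balance for the cup-on-table example (illustrative numbers only).
m, g = 0.3, 9.8        # assumed mass of the cup (kg) and gravitational acceleration (m/s^2)
f_gravity = -m * g     # downward force of gravity
f_table = m * g        # upward 'elastic' force exerted by the table
resultant = f_gravity + f_table
print(resultant)  # 0.0
```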
  •
    Unification and Scientific Realism Revisited
    PSA Proceedings of the Biennial Meeting of the Philosophy of Science Association 1986 (1): 394-405. 1986.
    Section 2 will begin by formulating Reichenbach’s principle of common cause in a more general way than is usual, but in a way that makes the idea behind it much clearer. The way that Salmon has pushed the principle into the service of scientific realism will be explained in terms of an example; van Fraassen objects, Salmon modifies his stand, and van Fraassen rejoins, all in section 2 (see van Fraassen 1980, chapter 2). In this episode I think van Fraassen is right in claiming, against Salmon, tha…
  •
    How the Laws of Physics Lie
    Philosophy of Science 52 (3): 478-480. 1985.
  •
    Traditional analyses of the curve fitting problem maintain that the data do not indicate what form the fitted curve should take. Rather, this issue is said to be settled by prior probabilities, by simplicity, or by a background theory. In this paper, we describe a result due to Akaike [1973], which shows how the data can underwrite an inference concerning the curve's form based on an estimate of how predictively accurate it will be. We argue that this approach throws light on the theoretical vir…
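    The Akaike-style inference sketched in this abstract can be illustrated with a toy computation, assuming the Gaussian form of AIC, n·log(RSS/n) + 2k, where k counts the adjustable parameters; the data below are made up:

```python
import math

# Toy data, roughly linear in x (made-up numbers).
xs = [0, 1, 2, 3, 4]
ys = [0.1, 2.2, 3.9, 6.1, 8.0]
n = len(xs)

def rss_constant():
    # One adjustable parameter: the mean.
    m = sum(ys) / n
    return sum((y - m) ** 2 for y in ys)

def rss_linear():
    # Two adjustable parameters: intercept a and slope b (least squares).
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))

def aic(rss, k):
    # Gaussian-likelihood form of AIC: fit is traded off against parameter count k.
    return n * math.log(rss / n) + 2 * k

scores = {"constant": aic(rss_constant(), 1), "linear": aic(rss_linear(), 2)}
best = min(scores, key=scores.get)
print(best)  # linear: the better fit outweighs the penalty for the extra parameter
```

Here the data reward the extra parameter; with noisier or flatter data the penalty term 2k can reverse the choice.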
  •
    The Emergence of the Macroworld: A Study of Intertheory Relations in Classical and Quantum Mechanics
    with Alexey Kryukov
    Philosophy of Science 70 (5): 1039-1051. 2003.
    Classical mechanics is empirically successful because the probabilistic mean values of quantum mechanical observables follow the classical equations of motion to a good approximation (Messiah 1970, 215). We examine this claim for the one-dimensional motion of a particle in a box, and extend the idea by deriving a special case of the ideal gas law in terms of the mean value of a generalized force used to define "pressure." The examples illustrate the importance of probabilistic averaging as a met…
  •
    Model selection in science: The problem of language variance
    British Journal for the Philosophy of Science 50 (1): 83-102. 1999.
    Recent solutions to the curve-fitting problem, described in Forster and Sober ([1995]), trade off the simplicity and fit of hypotheses by defining simplicity as the paucity of adjustable parameters. Scott De Vito ([1997]) charges that these solutions are 'conventional' because he thinks that the number of adjustable parameters may change when the hypotheses are described differently. This he believes is exactly what is illustrated in Goodman's new riddle of induction, otherwise known as the grue…
  •
    The Frugal Inference of Causal Relations
    with Garvesh Raskutti, Reuben Stern, and Naftali Weinberger
    British Journal for the Philosophy of Science 69 (3): 821-848. 2018.
    Recent approaches to causal modelling rely upon the causal Markov condition, which specifies which probability distributions are compatible with a directed acyclic graph. Further principles are required in order to choose among the large number of DAGs compatible with a given probability distribution. Here we present a principle that we call frugality. This principle tells one to choose the DAG with the fewest causal arrows. We argue that frugality has several desirable properties compared to th…
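    The frugality principle itself is simple to state in code. A minimal sketch, with hypothetical edge lists standing in for DAGs assumed to be compatible with the same probability distribution:

```python
# Candidate DAGs over X, Y, Z, each given as a list of causal arrows.
# The names and edges are illustrative, not from the paper.
candidate_dags = {
    "chain": [("X", "Y"), ("Y", "Z")],
    "extra": [("X", "Y"), ("Y", "Z"), ("X", "Z")],
}

def frugal_choice(dags):
    # Frugality: among compatible DAGs, prefer the one with the fewest arrows.
    return min(dags, key=lambda name: len(dags[name]))

print(frugal_choice(candidate_dags))  # chain
```

The substantive work, of course, lies in first determining which DAGs are compatible with the distribution; the rule above only adjudicates among them.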
  •
    Kenneth Wilson won the Nobel Prize in Physics in 1982 for applying the renormalization group, which he had learnt from quantum field theory (QFT), to problems in statistical physics: the induced magnetization of materials (ferromagnetism) and the evaporation and condensation of fluids (phase transitions). See Wilson (1983). The renormalization group got its name from its early applications in QFT. There, it appeared to be a rather ad hoc method of subtracting away unwanted infinities. The further allegat…
  •
    Ellery Eells, 1953-2006
    Proceedings and Addresses of the American Philosophical Association 80 (2). 2006.
  •
    Deductive logic is about the validity of arguments. An argument is valid when its conclusion follows deductively from its premises. Here’s an example: If Alice is guilty then Bob is guilty, and Alice is guilty. Therefore, Bob is guilty. The validity of the argument has nothing to do with what the argument is about. It has nothing to do with the meaning, or content, of the argument beyond the meaning of logical phrases such as if…then. Thus, any argument of the following form (called modus ponens…
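    Since validity is purely formal, modus ponens can be checked mechanically by running through the truth table: in every row where both premises are true, the conclusion must be true as well. A small sketch (not from the text):

```python
from itertools import product

def implies(p, q):
    # Material conditional: 'if p then q' is false only when p is true and q is false.
    return (not p) or q

# Modus ponens: premises 'if A then B' and 'A'; conclusion 'B'.
# Validity = the conclusion holds in every truth-table row where both premises hold.
valid = all(b
            for a, b in product([True, False], repeat=2)
            if implies(a, b) and a)
print(valid)  # True
```

The check never mentions guilt, Alice, or Bob, which is the point: validity depends only on the argument's form.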
  •
    Wayne Myrvold (2003) has captured an important feature of unified theories, and he has done so in Bayesian terms. What is not clear is whether the virtue of such unification is most clearly understood in terms of Bayesian confirmation. I argue that the virtue of such unification is better understood in terms of other truth-related virtues such as predictive accuracy.
  •
    What is induction? John Stuart Mill (1874, p. 208) defined induction as the operation of discovering and proving general propositions. William Whewell (in Butts, 1989, p. 266) agrees with Mill’s definition as far as it goes. Is Whewell therefore assenting to the standard concept of induction, which talks of inferring a generalization of the form “All As are Bs” from the premise that “All observed As are Bs”? Does Whewell agree, to use Mill’s example, that inferring “All humans are mortal” from t…
  •
    Sober (1984) has considered the problem of determining the evidential support, in terms of likelihood, for a hypothesis that is incomplete in the sense of not providing a unique probability function over the event space in its domain. Causal hypotheses are typically like this because they do not specify the probability of their initial conditions. Sober's (1984) solution to this problem does not work, as will be shown by examining his own biological examples of common cause explanation. The prop…
  •
    The paper provides a formal proof that efficient estimates of parameters, which vary as little as possible when measurements are repeated, may be expected to provide more accurate predictions. The definition of predictive accuracy is motivated by the work of Akaike (1973). Surprisingly, the same explanation provides a novel solution to a well-known problem for standard theories of scientific confirmation: the Ravens Paradox. This is significant in light of the fact that standard Bayesian an…
  •
    The distinction itself is best explained as follows. At the empirical level (at the bottom), there are curves, or functions, or laws, such as PV = constant in Boyle’s example, or a = M/r² in Newton’s example. The first point is that such formulae are actually ambiguous as to the hypotheses they represent. They can be understood in two ways. In order to make this point clear, let me first introduce a terminological distinction between variables and parameters. Acceleration and distance (a and r…
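    The variable/parameter distinction can be made concrete in code: a parameter such as Boyle's constant is estimated once from the whole data set, while the variables range over the individual measurements. A sketch with made-up numbers satisfying PV = 8:

```python
# Hypothetical isothermal pressure-volume data: (V, P) pairs with P*V = 8.
data = [(1.0, 8.0), (2.0, 4.0), (4.0, 2.0)]

# 'c' is a PARAMETER: one constant, fixed by fitting the whole data set.
c = sum(p * v for v, p in data) / len(data)

# 'V' and 'P' are VARIABLES: they take different values row by row.
def pressure(v):
    return c / v

print(c)            # 8.0
print(pressure(8))  # 1.0
```

Read one way, PV = constant is a single curve (c fixed at 8); read the other way, it is a family of hypotheses, one for each admissible value of the parameter c.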
  •
    Whewell, William (b Lancaster, England, 24 May 1794; d Cambridge, England, 6 March 1866) Born the eldest son of a carpenter, William Whewell rose to become Master of Trinity College, Cambridge and a central figure in Victorian science. After attending the grammar school at Heversham in Westmorland, Whewell entered Trinity College, Cambridge and graduated Second Wrangler. He became a Fellow of the College in 1817, took his M.A. degree in 1819, and his D.D. degree in 1844.
  •
    A and B in signaling games (Lewis 1969). Members of the population, such as our prehistoric pair, are occasionally faced with the following ‘game’. Let one of the players be the receiver and the other the sender. The receiver needs to know whether B is true or not, but only possesses information about whether A is true or not. In some environmental contexts, A is sufficient for B, in others it is not. The sender knows nothing about A or B, but does know that A is sufficient for B in some environ…
  •
    Although in every inductive inference, an act of invention is requisite, the act soon slips out of notice. Although we bind together facts by superinducing upon them a new Conception, this Conception, once introduced and applied, is looked upon as inseparably connected with the facts, and necessarily implied in them. Having once had the phenomena bound together in their minds in virtue of the Conception men can no longer easily restore them back to the detached and incoherent condition in which …
  •
    How do simple rules ‘fit to reality’ in a complex world?
    Minds and Machines 9 (4): 543-564. 1999.
    The theory of fast and frugal heuristics, developed in a new book called Simple Heuristics That Make Us Smart (Gigerenzer, Todd, and the ABC Research Group, in press), includes two requirements for rational decision making. One is that decision rules are bounded in their rationality: rules are frugal in what they take into account, and therefore fast in their operation. The second is that the rules are ecologically adapted to the environment, which means that they ‘fit to reality.’ The ma…
  •
    Ramsey, Stich and Garon (1991) argue that if the correct theory of mind is some parallel distributed processing theory, then folk psychology must be false. Their idea is that if the nodes and connections that encode one representation are causally active, then all representations encoded by the same set of nodes and connections are also causally active. We present a clear, and concrete, counterexample to RSG's argument. In conclusion, we suggest that folk psychology and connectionism are best und…
  •
    The Value of Good Illustrative Examples: In order to speak as generally as possible about science, philosophers of science have traditionally formulated their theses in terms of elementary logic and elementary probability theory. They often point to real scientific examples without explaining them in detail and/or use artificial examples that fail to fit the intricacies of real examples. Sometimes their illustrative examples are chosen to fit their framework, rather than the science. Frequently…
  •
    Suppose that the true structural equation is Y = X + U, where U is N(0,1), X is N(0,1), and X and U are independent. Now let µ_x be the mean of X, µ_y the mean of Y, and σ_x the standard deviation of X…
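    The setup can be checked by simulation: with X and U independent N(0,1), Y = X + U has mean 0 and standard deviation √2. A minimal sketch, not from the paper:

```python
import math
import random

# Simulate the structural equation Y = X + U with X, U independent N(0,1).
random.seed(0)
n = 100_000
xs = [random.gauss(0, 1) for _ in range(n)]
us = [random.gauss(0, 1) for _ in range(n)]
ys = [x + u for x, u in zip(xs, us)]

mean_y = sum(ys) / n
sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys) / n)

# Independence makes the variances add: Var(Y) = Var(X) + Var(U) = 2.
print(round(mean_y, 2), round(sd_y, 2))
```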
  •
    Predictive accuracy as an achievable goal of science
    Proceedings of the Philosophy of Science Association 2002 (3). 2002.
    What has science actually achieved? A theory of achievement should define what has been achieved, describe the means or methods used in science, and explain how such methods lead to such achievements. Predictive accuracy is one truth‐related achievement of science, and there is an explanation of why common scientific practices tend to increase predictive accuracy. Akaike’s explanation for the success of AIC is limited to interpolative predictive accuracy. But therein lies the strength of the gen…
  •
    Counterexamples to a likelihood theory of evidence
    Minds and Machines 16 (3): 319-338. 2006.
    The likelihood theory of evidence (LTE) says, roughly, that all the information relevant to the bearing of data on hypotheses (or models) is contained in the likelihoods. There exist counterexamples in which one can tell which of two hypotheses is true from the full data, but not from the likelihoods alone. These examples suggest that some forms of scientific reasoning, such as the consilience of inductions (Whewell, 1858. In Novum organon renovatum (Part II of the 3rd ed.). The philosophy of th…
  •
    A Philosopher’s Guide to Empirical Success
    Philosophy of Science 74 (5): 588-600. 2007.
    The simple question ‘what is empirical success?’ turns out to have a surprisingly complicated answer. We need to distinguish between meritorious fit and ‘fudged fit’, which is akin to the distinction between prediction and accommodation. The final proposal is that empirical success emerges in a theory-dependent way from the agreement of independent measurements of theoretically postulated quantities. Implications for realism and Bayesianism are discussed. ‡This paper was written when I was a visi…
  •
    Scientific discovery is often regarded as romantic and creative (and hence unanalyzable), whereas the everyday process of verifying discoveries is sober and more suited to analysis. Yet this fascinating exploration of how scientific work proceeds argues that however sudden the moment of discovery may seem, the discovery process can be described and modeled. Using the methods and concepts of contemporary information-processing psychology (or cognitive science) the authors develop a series of art…
  •
    The golfer's dilemma: A reply to Kukla on curve-fitting
    British Journal for the Philosophy of Science 46 (3): 348-360. 1995.
    Curve-fitting typically works by trading off goodness-of-fit with simplicity, where simplicity is measured by the number of adjustable parameters. However, such methods cannot be applied in an unrestricted way. I discuss one such correction, and explain why the exception arises. The same kind of probabilistic explanation offers a surprising resolution to a common-sense dilemma.