  •
    David Danks. Psychological Theories of Categorization as Probabilistic Models
  •
    Rate-Agnostic Structure Learning
    with Sergey Plis, Cynthia Freeman, and Vince Calhoun
    Causal structure learning from time series data is a major scientific challenge. Existing algorithms assume that measurements occur sufficiently quickly; more precisely, they assume that the system and measurement timescales are approximately equal. In many scientific domains, however, measurements occur at a significantly slower rate than the underlying system changes. Moreover, the size of the mismatch between timescales is often unknown. This paper provides three distinct causal structure lea…
  •
    Learning by artificial intelligence systems (what I will typically call machine learning) has a distinguished history, and the field has experienced something of a renaissance in the past twenty years. Machine learning consists principally of a diverse set of algorithms and techniques that have been applied to problems in a wide range of domains. Any overview of the methods and applications will inevitably be incomplete, at least at the level of specific algorithms and techniques. There are many e…
  •
    In many people, caffeine causes slight muscle tremors, particularly in their hands. In general, the Caffeine → Muscle Tremors causal connection is a noisy one: someone can drink coffee and experience no hand shaking, and there are many other factors that can lead to muscle tremors. Now suppose that Jane drinks several cups of coffee and then notices that her hands are trembling; an obvious question is: did this instance of coffee drinking cause this instance of hand-trembling? Structurally simil…
  •
    It is the height of banality to observe that people, not bullets, fight kinetic wars. The machinery of kinetic warfare is obviously relevant to the conduct of each particular act of warfare, but the reasons for, and meanings of, those acts depend critically on the fact that they are done by humans. Any attempt to understand warfare—its causes, strategies, legitimacy, dynamics, and resolutions—must incorporate humans as an intrinsic part, both descriptively and normatively. Humans from general st…
  •
    Most learning models assume, either implicitly or explicitly, that the goal of learning is to acquire a complete and veridical representation of the world, but this view assumes away the possibility that pragmatic goals can play a central role in learning. We propose instead that people are relatively frugal learners, acquiring goal-relevant information while ignoring goal-irrelevant features of the environment. Experiment 1 provides evidence that learning is goal-dependent, and that people are…
  •
    Models based on causal capacities, or independent causal influences/mechanisms, are widespread in the sciences. This paper develops a natural mathematical framework for representing such capacities by extending and generalizing previous results in cognitive psychology and machine learning, based on observations and arguments from prior philosophical debates. In addition to its substantial generality, the resulting framework provides a theoretical unification of the widely-used noisy-OR/AND and l…
  •
    Even if one can experiment on relevant factors, learning the causal structure of a dynamical system can be quite difficult if the relevant measurement processes occur at a much slower sampling rate than the “true” underlying dynamics. This problem is exacerbated if the degree of mismatch is unknown. This paper gives a formal characterization of this learning problem, and then provides two sets of results. First, we prove a set of theorems characterizing how causal structures change under undersa…
  •
    Arguments, claims, and discussions about the “level of description” of a theory are ubiquitous in cognitive science. Such talk is typically expressed more precisely in terms of the granularity of the theory, or in terms of Marr’s three levels. I argue that these ways of understanding levels of description are insufficient to capture the range of different types of theoretical commitments that one can have in cognitive science. When we understand these commitments as points in a multi-dimensional…
  •
    Research on human causal learning has largely focused on strength learning, or on computational-level theories; there are few formal algorithmic models of how people learn causal structure from covariations. We introduce a model that learns causal structure in a local manner via prediction-error learning. This local learning is then integrated dynamically into a unified representation of causal structure. The model uses computationally plausible approximations of rational learning, and so repres…
  •
    Structure learning algorithms for graphical models have focused almost exclusively on stable environments in which the underlying generative process does not change; that is, they assume that the generating model is globally stationary. In real-world environments, however, such changes often occur without warning or signal. Real-world data often come from generating models that are only locally stationary. In this paper, we present LoSST, a novel, heuristic structure learning algorithm that trac…
  •
    The Rescorla–Wagner model has been a leading theory of animal causal induction for nearly 30 years, and human causal induction for the past 15 years. Recent theories have provided alternative explanations of how people draw causal conclusions from covariational data. However, theoretical attempts to compare the Rescorla–Wagner model with more recent models have been hampered by the fact that the Rescorla–Wagner model is an algorithmic theory, while the more recent theories are all computati…
  •
    Mesochronal Structure Learning
    with Sergey Plis and Jianyu Yang
    Standard time series structure learning algorithms assume that the measurement timescale is approximately the same as the timescale of the underlying system. In many scientific contexts, however, this assumption is violated: the measurement timescale can be substantially slower than the system timescale. This assumption violation can lead to significant learning errors. In this paper, we provide a novel learning algorithm to extract system-timescale structure from measurement data that undersampl…
  •
    Building Theories: Heuristics and Hypotheses in Sciences (edited book)
    Springer International Publishing. 2018.
  •
    Comorbid science?
    with Stephen Fancsali, Clark Glymour, and Richard Scheines
    Behavioral and Brain Sciences 33 (2-3). 2010.
    We agree with Cramer et al.'s goal of the discovery of causal relationships, but we argue that the authors' characterization of latent variable models (as deployed for such purposes) overlooks a wealth of extant possibilities. We provide a preliminary analysis of their data, using existing algorithms for causal inference and for the specification of latent variable models.
  •
    LPCD framework: Analytical tool or psychological model?
    Behavioral and Brain Sciences 41. 2018.
  • Richer Than Reduction
    In David Danks & Emiliano Ippoliti (eds.), Building Theories: Heuristics and Hypotheses in Sciences, Springer Verlag. 2018.
  • The Epistemology of Causal Judgment
    Dissertation, University of California, San Diego. 2001.
    We make constant use of causal beliefs in our everyday lives without giving much thought to the source of those beliefs, even for situations about which we have no specific prior causal knowledge. We can ask two distinct types of questions about these causal judgments: descriptive questions and normative questions. The primary goal of this dissertation is to apply normative research on causal judgment to our descriptive theories. I begin this dissertation by describing the primary results of r…