1.
    In March, I’ll be talking at Spencer Breiner’s workshop on Applied Category Theory at the National Institute of Standards and Technology. I’ll be giving a joint talk with John Foley about our work using operads to design networks. …
    Found 8 hours, 18 minutes ago on Azimuth
  2.
    As part of the week recognizing R. A. Fisher (February 17, 1890 – July 29, 1962), I reblog a guest post by Stephen Senn from 2012/2017. The comments from 2017 lead to a troubling issue that I will bring up in the comments today. …
    Found 19 hours, 23 minutes ago on D. G. Mayo's blog
  3.
    The distribution of matter in our universe is strikingly time asymmetric. Most famously, the Second Law of Thermodynamics says that entropy tends to increase toward the future but not toward the past. But what explains this time-asymmetric distribution of matter? In this paper, I explore the idea that time itself has a direction by drawing from recent work on grounding and metaphysical fundamentality. I will argue that positing such a direction of time, in addition to time-asymmetric boundary conditions (such as the so-called “past hypothesis”), enables a better explanation of the thermodynamic asymmetry than is available otherwise.
    Found 2 days, 6 hours ago on PhilPapers
  4.
    There are two standard responses to the discrepancy between observed galactic rotation curves and the theoretical curves calculated on the basis of luminous matter: postulate dark matter, or modify gravity. Most physicists accept the former as part of the concordance model of cosmology; the latter encompasses a family of proposals, of which MOND is perhaps the best-known example. Don Saari, however, claims to have found a third alternative: to explain this discrepancy as a result of approximation methods which are unfaithful to the underlying Newtonian dynamics. If he is correct, eliminating the problematic approximations should allow physicists and astronomers to preserve the validity of Newtonian dynamics in galactic systems without invoking dark matter.
    Found 3 days, 1 hour ago on PhilSci Archive
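    For orientation, the discrepancy the abstract describes can be put in one line. In Newtonian dynamics, a star on a circular orbit of radius $r$ about enclosed mass $M(r)$ satisfies

    $$\frac{v(r)^2}{r} = \frac{G\,M(r)}{r^2} \quad\Longrightarrow\quad v(r) = \sqrt{\frac{G\,M(r)}{r}},$$

    so beyond the luminous disc, where $M(r)$ is roughly constant, $v(r)$ should fall off as $r^{-1/2}$; observed rotation curves instead stay roughly flat out to large radii.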
  5.
    We defend the many-worlds interpretation of quantum mechanics (MWI) against the objection that it cannot explain why measurement outcomes are predicted by the Born probability rule. We understand quantum probabilities in terms of an observer’s self-location probabilities. We formulate a probability postulate for the MWI: the probability of self-location in a world with a given set of outcomes is the absolute square of that world’s amplitude. We provide a proof of this postulate, which assumes the quantum formalism and two principles concerning symmetry and locality. We also show how a structurally similar proof of the Born rule is available for collapse theories. We conclude by comparing our account to the recent account offered by Sebens and Carroll.
    Found 3 days, 1 hour ago on PhilSci Archive
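    In symbols, the postulate described above can be restated as follows (a minimal reconstruction; the decomposition into orthogonal world states is assumed for illustration, not quoted from the paper). If the universal state decomposes as

    $$|\Psi\rangle = \sum_i a_i\,|w_i\rangle, \qquad \langle w_i | w_j \rangle = \delta_{ij},$$

    then the probability of self-location in world $w_i$ is

    $$P(w_i) = |a_i|^2,$$

    which coincides with the Born rule.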
  6.
    Use of ‘representation’ pervades the literature in cognitive science. But do representations actually play a role in cognitive-scientific explanation, or is such talk merely colorful commentary? Are, for instance, patterns of cortical activity in motion-sensitive visual area MT, or strings of symbols in a language-processing parser, genuine representations? Do they have content? And if they do, can a naturalist assign such contents in a well-motivated and satisfying way?
    Found 4 days, 6 hours ago on PhilPapers
  7.
    Modern medicine is often said to have originated with 19th-century germ theory, which attributed diseases to particular bacterial contagions. The success of this theory is often associated with an underlying principle referred to as the “doctrine of specific etiology,” which concerns the theory’s specificity at the level of disease causation or etiology. Despite the perceived importance of this doctrine, the literature lacks a clear account of the types of specificity it involves and why exactly they matter. This paper argues that the 19th-century germ theory model involves two types of specificity at the level of etiology. One type receives significant attention in the literature, but its influence on modern medicine has been misunderstood. A second type is present in this model, but it has been overlooked in the extant literature. My analysis clarifies how these types of specificity led to a novel conception of etiology, which continues to figure in medicine today.
    Found 5 days, 1 hour ago on PhilSci Archive
  8.
    Humean accounts of natural lawhood (such as Lewis’s) have often been criticized as unable to account for the laws’ characteristic explanatory power in science. Loewer (Philos Stud 160:115–137, 2012) has replied that these criticisms fail to distinguish grounding explanations from scientific explanations. Lange (Philos Stud 164:255–261, 2013) has replied by arguing that grounding explanations and scientific explanations are linked by a transitivity principle, which can be used to argue that Humean accounts of natural law violate the prohibition on self-explanation. Lange’s argument has been sharply criticized by Hicks and van Elswyk (Philos Stud 172:433–443, 2015), Marshall (Philos Stud 172:3145–3165, 2015), and Miller (Philos Stud 172:1311–1332, 2015). This paper shows how Lange’s argument can withstand these criticisms once the transitivity principle and the prohibition on self-explanation are properly refined. The transitivity principle should be refined to accommodate contrasts in the explanans and explanandum. The prohibition on self-explanation should be refined so that it precludes a given fact p from helping to explain why some other fact q helps to explain why p. In this way, the transitivity principle avoids having counterintuitive consequences in cases involving macrostates having multiple possible microrealizations. The transitivity principle is perfectly compatible with the irreducibility of macroexplanations to microexplanations and with the diversity of the relations that can underwrite scientific explanations.
    Found 5 days, 19 hours ago on Marc Lange's site
  9.
    I argue for patternism, a new answer to the question of when some objects compose a whole. None of the standard principles of composition comfortably capture our natural judgments, such as that my cat exists and my table exists, but there is nothing wholly composed of them. Patternism holds, very roughly, that some things compose a whole whenever together they form a “real pattern”. Plausibly we are inclined to acknowledge the existence of my cat and my table but not of their fusion, because the first two have a kind of internal organizational coherence that their putative fusion lacks. Kolmogorov complexity theory supplies the needed rigorous sense of “internal organizational coherence”.
    Found 6 days, 6 hours ago on PhilPapers
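    The appeal to Kolmogorov complexity can be made concrete. Kolmogorov complexity is uncomputable, but compressed length is a standard computable proxy, so a rough sketch of "internal organizational coherence" as compressibility is easy to write down (a toy illustration of the general idea, not the paper's construction):

```python
import os
import zlib

def compression_ratio(data: bytes) -> float:
    """Compressed size over raw size. Lower values indicate more
    internal regularity -- a crude, computable stand-in for low
    Kolmogorov complexity, which is itself uncomputable."""
    return len(zlib.compress(data)) / len(data)

patterned = b"ab" * 500   # strongly patterned: compresses well
noise = os.urandom(1000)  # (pseudo)random bytes: barely compress

print(compression_ratio(patterned))  # ~0.02: a "real pattern"
print(compression_ratio(noise))      # ~1.0: no detectable pattern
```

    On this way of cashing things out, a cat or a table counts as a "real pattern" because its description compresses well relative to an arbitrary fusion of scattered parts.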
  10.
    Optogenetic techniques are described as “revolutionary” for the unprecedented causal control they allow neuroscientists to exert over neural activity in awake behaving animals. In this paper, I demonstrate by means of a case study that optogenetic techniques will only illuminate causal links between the brain and behavior to the extent that their error characteristics are known and, further, that determining these error characteristics requires (1) comparison of optogenetic techniques with techniques having well known error characteristics (methodological pluralism) and (2) consideration of the broader neural and behavioral context in which the targets of optogenetic interventions are situated (perspectival pluralism).
    Found 6 days, 6 hours ago on PhilPapers
  11.
    Comparativism is the position that the fundamental doxastic state consists in comparative beliefs (e.g., believing p to be more likely than q), with partial beliefs (e.g., believing p to degree x) being grounded in and explained by patterns amongst comparative beliefs that exist under special conditions. In this paper, I develop a version of comparativism that originates with a suggestion made by Frank Ramsey in his ‘Probability and Partial Belief’ (1929). By means of a representation theorem, I show how this ‘Ramseyan comparativism’ can be used to weaken the (unrealistically strong) conditions required for probabilistic coherence that comparativists usually rely on, while still preserving enough structure to let us retain the usual comparativists’ account of quantitative doxastic comparisons.
    Found 6 days, 7 hours ago on Edward Elliott's site
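    For readers unfamiliar with the coherence conditions at issue, the following sketch shows what the classical (de Finetti-style) qualitative-probability axioms for a comparative belief relation look like when checked directly; the Ramseyan conditions the paper develops are deliberately weaker than these (the code is my illustration, not the paper's):

```python
from itertools import combinations

def powerset(worlds):
    s = list(worlds)
    return [frozenset(c) for r in range(len(s) + 1)
            for c in combinations(s, r)]

def satisfies_classical_axioms(geq, worlds):
    """geq is a set of pairs (A, B), read 'A is at least as likely
    as B', with A and B frozensets of worlds. Checks totality,
    transitivity, and additivity (for C disjoint from both A and B,
    A >= B iff A-union-C >= B-union-C)."""
    props = powerset(worlds)
    total = all((a, b) in geq or (b, a) in geq
                for a in props for b in props)
    transitive = all((a, c) in geq
                     for (a, b1) in geq for (b2, c) in geq
                     if b1 == b2)
    additive = all(((a, b) in geq) == ((a | c, b | c) in geq)
                   for a in props for b in props for c in props
                   if not (a & c) and not (b & c))
    return total and transitive and additive

# The relation induced by a uniform measure over three worlds
# (A >= B iff A contains at least as many worlds) passes the axioms.
worlds = {1, 2, 3}
geq = {(a, b) for a in powerset(worlds) for b in powerset(worlds)
       if len(a) >= len(b)}
print(satisfies_classical_axioms(geq, worlds))  # True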
  12.
    A number of naturalistic philosophers of mind endorse a realist attitude towards the results of Bayesian cognitive science. This realist attitude is currently unwarranted, however. It is not obvious that Bayesian models possess special epistemic virtues over alternative models of mental phenomena involving uncertainty. In particular, the Bayesian approach in cognitive science is not simpler, more unifying, or more rational than alternative approaches; nor is it obvious that the Bayesian approach is more empirically adequate than the alternatives. It is at least premature, then, to assert that mental phenomena involving uncertainty are best explained within the Bayesian approach. Continuing to praise Bayes exclusively would be dangerous, as it risks monopolizing attention and crowding out different but promising formal approaches. Naturalistic philosophers of mind would be wise to correct this mistake by endorsing an agnostic, instrumentalist attitude towards Bayesian cognitive science.
    Found 6 days, 9 hours ago on PhilSci Archive
  13.
    The ontic conception of explanation, according to which explanations are "full-bodied things in the world," is fundamentally misguided. I argue instead for what I call the eikonic conception, according to which explanations are the product of an epistemic activity involving representations of the phenomena to be explained. What is explained in the first instance is a particular conceptualization of the explanandum phenomenon, contextualized within a given research program or explanatory project. I conclude that this eikonic conception has a number of benefits, including making better sense of scientific practice and allowing for the full range of normative constraints on explanation.
    Found 6 days, 9 hours ago on PhilSci Archive
  14.
    People often talk about the synchronic Dutch Book argument for Probabilism and the diachronic Dutch Strategy argument for Conditionalization. But the synchronic Dutch Book argument for the Principal Principle is mentioned less. …
    Found 6 days, 10 hours ago on M-Phi
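    As background, the synchronic Dutch Book for Probabilism fits in a few lines (a standard textbook illustration with made-up numbers; the post's own target is the Principal Principle version). An agent who regards a bet paying 1 if X as fair at price cr(X), but whose credences in A and not-A do not sum to 1, can be offered a pair of individually fair bets that jointly guarantee a loss:

```python
def sure_loss(cr_A: float, cr_not_A: float, stake: float = 1.0) -> float:
    """Guaranteed loss for an agent whose credences in A and not-A
    sum to something other than 1. The bookie sells the agent both
    bets (each at the agent's own fair price) when the credences sum
    to more than 1, and buys both when they sum to less; either way
    exactly one bet pays out `stake`."""
    return abs(cr_A + cr_not_A - 1.0) * stake

print(round(sure_loss(0.6, 0.7), 2))  # 0.3: incoherent, sure loss
print(round(sure_loss(0.6, 0.4), 2))  # 0.0: coherent, no Dutch Book
```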
  15.
    [The following is a guest post by Bob Lockie. — JS] He who says that all things happen of necessity can hardly find fault with one who denies that all happens by necessity; for on his own theory this very argument is voiced by necessity (Epicurus 1964: XL). …
    Found 6 days, 20 hours ago on The Brains Blog
  16.
    Now students in the Applied Category Theory 2018 school are reading about categories applied to linguistics. Read the blog article here for more: • Jade Master and Cory Griffith, Linguistics using category theory, The n-Category Café, 6 February 2018. …
    Found 1 week, 1 day ago on Azimuth
  17.
    In the spirit of explanatory pluralism, this chapter argues that causal and noncausal explanations of a phenomenon are compatible, each being useful for bringing out different sorts of insights. After reviewing a model-based account of scientific explanation, which can accommodate causal and noncausal explanations alike, an important core conception of noncausal explanation is identified. This noncausal form of model-based explanation is illustrated using the example of how Earth scientists in a subfield known as aeolian geomorphology are explaining the formation of regularly-spaced sand ripples. The chapter concludes that even when it comes to everyday "medium-sized dry goods" such as sand ripples, where there is a complete causal story to be told, one can find examples of noncausal scientific explanations.
    Found 1 week, 1 day ago on PhilSci Archive
  18.
    In my book Understanding Scientific Progress (Maxwell 2017), I argue that fundamental philosophical problems about scientific progress, above all the problem of induction, cannot be solved granted standard empiricism (SE), a doctrine which most scientists and philosophers of science take for granted. A key tenet of SE is that no permanent thesis about the world can be accepted as a part of scientific knowledge independent of evidence. For a number of reasons, we need to adopt a rather different conception of science which I call aim-oriented empiricism (AOE). This holds that we need to construe physics as accepting, as a part of theoretical scientific knowledge, a hierarchy of metaphysical theses about the comprehensibility and knowability of the universe, these theses becoming increasingly insubstantial as we go up the hierarchy. Fundamental philosophical problems about scientific progress, including the problems of induction, theory unity, verisimilitude and scientific discovery, which cannot be solved granted SE, can be solved granted AOE.
    Found 1 week, 1 day ago on PhilSci Archive
  19.
    This contribution addresses the question of whether the methodology followed in building and assessing string theory can be considered scientific – in the same sense, say, that the methodology followed in building and assessing the Standard Model of particle physics is scientific – by focussing on the “founding” period of the theory. More precisely, its aim is to argue for a positive answer to this question in the light of a historical analysis of the early developments of the string-theoretic framework. The paper’s main claim is a simple one: there is no real change of scientific status in the way of proceeding and reasoning in fundamental physical research. Looking at the developments of quantum field theory and string theory since their very beginnings, one sees the very same strategies at work in both theory building and theory assessment. Indeed, as the history of string theory clearly shows (see Cappelli et al., 2012), the methodology characterising the theoretical process leading to the string idea and its successive developments is not significantly different from the one characterising many fundamental developments in theoretical physics that were later crowned with successful empirical confirmation (sometimes after a considerable number of years, as exemplified by the story of the Higgs particle).
    Found 1 week, 3 days ago on PhilSci Archive
  20.
    This paper demonstrates that nonmechanistic, dynamical explanations are a viable approach to explanation in the special sciences. The claim that dynamical models can be explanatory without reference to mechanisms has previously been met with three lines of criticism from mechanists: the causal relevance concern, the genuine laws concern, and the charge of predictivism. I argue, however, that these mechanist criticisms fail to defeat nonmechanistic, dynamical explanation. Using the examples of Haken et al.’s ([1985]) HKB model of bimanual coordination, and Thelen et al.’s ([2001]) dynamical field model of infant perseverative reaching, I show how each mechanist criticism fails once the standards of Woodward’s ([2003]) interventionist framework are applied to dynamical models. An even-handed application of Woodwardian interventionism reveals that dynamical models are capable of producing genuine explanations without appealing to underlying mechanistic details.
    Found 1 week, 3 days ago on PhilSci Archive
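    The HKB model cited here has a compact canonical form for the relative phase $\phi$ between the two hands, $\dot{\phi} = -a\sin\phi - 2b\sin 2\phi$ (Haken et al. 1985), and its signature behaviour is easy to reproduce numerically. In a minimal sketch (my own, with illustrative parameter values), the anti-phase attractor at $\phi = \pi$ exists only when $b/a$ is large enough and vanishes as $b/a$ shrinks, modelling the switch to in-phase coordination at high movement frequencies:

```python
import math

def hkb_step(phi, a, b, dt=0.01):
    """One Euler step of the HKB relative-phase equation
    dphi/dt = -a*sin(phi) - 2b*sin(2*phi)."""
    return phi + dt * (-a * math.sin(phi) - 2 * b * math.sin(2 * phi))

def settle(phi0, a, b, steps=20000):
    """Integrate from phi0 until the phase settles on an attractor."""
    phi = phi0
    for _ in range(steps):
        phi = hkb_step(phi, a, b)
    return phi

# Large b/a: anti-phase (phi = pi) is a stable attractor.
print(round(settle(2.5, a=1.0, b=1.0), 3))  # ~3.142
# Small b/a: the anti-phase attractor has vanished; only in-phase
# (phi = 0) remains, so the same starting point relaxes to 0.
print(round(settle(2.5, a=1.0, b=0.1), 3))  # ~0.0
```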
  21.
    Three arguments against universally regular probabilities have been posed, based on examples where, if regularity holds, then perfectly similar events must have different probabilities. Howson (2017) and Benci et al. (2016) have raised technical objections to these symmetry arguments, but their objections fail. Howson says that Williamson’s (2007) “isomorphic” events are not in fact isomorphic, but Howson is speaking of set-theoretic representations of events in a probability model. While those sets are not isomorphic, Williamson’s physical events are, in the relevant sense. Benci et al. claim that all three arguments rest on a conflation of different models, but they do not. They are founded on the premise that similar events should have the same probability in the same model, or in one case, on the assumption that a single rotation-invariant distribution is possible. Since these technical objections fail to refute the symmetry arguments, one could instead deny their implicit premises, at a heavy cost, or adopt varying degrees of instrumentalism or pluralism about regularity; but that would not serve the project of accurately modelling chances.
    Found 1 week, 3 days ago on PhilSci Archive
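    To make the symmetry arguments concrete, here is the usual reconstruction of Williamson's (2007) coin case (my paraphrase, for orientation). A fair coin is tossed at times $t = 1, 2, 3, \ldots$. Let $H_{1:\infty}$ be the event that every toss from the first onward lands heads, and $H_{2:\infty}$ the event that every toss from the second onward does. Independence gives

    $$P(H_{1:\infty}) = P(\text{heads at } t = 1)\cdot P(H_{2:\infty}) = \tfrac{1}{2}\,P(H_{2:\infty}),$$

    yet the two events are perfectly similar, so symmetry demands $P(H_{1:\infty}) = P(H_{2:\infty})$. The only real-valued solution is $P(H_{1:\infty}) = P(H_{2:\infty}) = 0$, violating regularity.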
  22.
    Antoine Le Grand (1629–1699) was a philosopher and Catholic theologian who played an important role in propagating the Cartesian philosophy in England during the latter half of the seventeenth century. He was born in Douai (at the time under the rule of the Spanish Habsburgs), and early in life was associated with an English community of Franciscans who had a college there. Le Grand became a Franciscan Recollect friar prior to leaving for England as a missionary in 1656. In England, he taught philosophy and theology, advocating Catholicism and eventually Cartesianism, the latter being as unpopular as the former was perilous.
    Found 1 week, 4 days ago on Stanford Encyclopedia of Philosophy
  23.
    The Kochen-Specker theorem is an important and subtle topic in the foundations of quantum mechanics (QM). The theorem demonstrates the impossibility of a certain type of interpretation of QM in terms of hidden variables (HV) that naturally suggests itself when one begins to consider the project of interpreting QM. We present the theorem/argument and the foundational discussion surrounding it at different levels. The reader looking for a quick overview should read the following sections and subsections: 1, 2, 3.1, 3.2, 4, and 6. Those who read the whole entry will find proofs of some non-trivial claims in supplementary documents.
    Found 1 week, 4 days ago on Stanford Encyclopedia of Philosophy
  24.
    Newton’s success sharpened our understanding of the nature of space and time in the seventeenth century. Einstein’s special and general relativity improved this understanding in the twentieth century. Quantum gravity is expected to take a step further, deepening our understanding of space and time by grasping the implications of the quantum nature of the physical world for space and time. The best way to see what happens to space and time when their quantum traits cannot be disregarded is to look at how this actually happens in a concrete theory of quantum gravity. Loop Quantum Gravity (LQG) [1–7] is among the few current theories sufficiently developed to provide a complete and clear-cut answer to this question. Here I discuss the role(s) that space and time play in LQG and the version of these notions required to make sense of a quantum gravitational world. For a detailed discussion, see the first part of the book [ ]. A brief summary of the structure of LQG is given in the Appendix, for the reader unfamiliar with this theory.
    Found 1 week, 4 days ago on PhilSci Archive
  25.
    Kant’s doctrine of transcendental idealism, as put forth in the first Critique, is best understood as a conceptual or epistemic doctrine. However, critics of the conceptual understanding of transcendental idealism argue that it amounts to an arbitrary stipulation and that it does not do justice to the real ontological distinctions that mattered for Kant. Some stipulations are better than others, however. In this paper I argue that Kant’s doctrine, though it should be understood ‘merely epistemically’, is nevertheless full of significance and is motivated by his long-running pre-critical struggle to discover first principles for metaphysical cognition. I further argue that an epistemic understanding of the doctrine of transcendental idealism provides a Kantian with a natural way of understanding the novel epistemic situation presented to us by modern physics, and in particular by quantum mechanics. And I argue that considering Kant’s philosophy in the light of the challenges posed by quantum mechanics illuminates, in return, several elements of his philosophical framework, notably the principle of causality, the doctrine of synthetic a priori principles in general, and, most generally, the conceptual understanding of transcendental idealism itself. I illustrate this via an analysis of the views of the physicist Niels Bohr as well as the views of the (neo-)Kantian philosopher Grete Hermann.
    Found 1 week, 4 days ago on PhilSci Archive
  26.
    The following general attitude to mathematics seems plausible: standard claims, such as ‘there are infinitely many primes’ or ‘every consistent set of sentences has a model’, are true; nevertheless, if one rifles through the fundamental furniture of the universe, one will not find mathematical objects, such as numbers, sets or models. A natural way of making sense of this attitude is to augment it with the following thought: this is possible because such standard claims have paraphrases that make clear that their truth does not require the fundamental existence of such objects. This paper will draw out some surprising consequences of this general approach to mathematics—an approach that I call paraphrase anti-realism. These consequences concern the relationship between logical structure, on the one hand, and explanatory structure, on the other.
    Found 1 week, 4 days ago on Bruno Whittle's site
  27.
    A heated debate surrounds the significance of reproducibility as an indicator for research quality and reliability, with many commentators linking a “crisis of reproducibility” to the rise of fraudulent, careless and unreliable practices of knowledge production. Through the analysis of discourse and practices across research fields, I point out that reproducibility is not only interpreted in different ways, but also serves a variety of epistemic functions depending on the research at hand. Given such variation, I argue that the uncritical pursuit of reproducibility as an overarching epistemic value is misleading and potentially damaging to scientific advancement. Requirements for reproducibility, however they are interpreted, are one of many available means to secure reliable research outcomes. Furthermore, there are cases where the focus on enhancing reproducibility turns out not to foster high-quality research. Scientific communities and Open Science advocates should learn from inferential reasoning from irreproducible data, and promote incentives for all researchers to explicitly and publicly discuss (1) their methodological commitments, (2) the ways in which they learn from mistakes and problems in everyday practice, and (3) the strategies they use to choose which research component of any project needs to be preserved in the long term, and how.
    Found 1 week, 5 days ago on PhilSci Archive
  28.
    This paper critically discusses an objection proposed by H. Nikolić against the naturalness of the stochastic dynamics implemented by the Bell-type Quantum Field Theory, an extension of Bohmian Mechanics able to describe the phenomena of particle creation and annihilation. Here I present: (i) Nikolić’s ideas for a pilot-wave theory accounting for QFT phenomenology, evaluating the robustness of his criticism, (ii) Bell’s original proposal for a Bohmian QFT with a particle ontology, and (iii) the mentioned Bell-type QFT. I will argue that although Bell’s model should be interpreted as a heuristic example showing the possibility of extending Bohm’s pilot-wave theory to the domain of QFT, the same judgement does not hold for the Bell-type QFT, which is a candidate to be a promising alternative to the standard version of quantum field theory. Finally, contra Nikolić, I will provide arguments to show how a stochastic dynamics is perfectly compatible with a Bohmian quantum theory.
    Found 1 week, 5 days ago on PhilSci Archive
  29.
    Levels of organization are structures in nature, usually defined by part-whole relationships, with things at higher levels being composed of things at the next lower level. Typical levels of organization that one finds in the literature include the atomic, molecular, cellular, tissue, organ, organismal, group, population, community, ecosystem, landscape, and biosphere levels. References to levels of organization and related hierarchical depictions of nature are prominent in the life sciences and their philosophical study, and appear not only in introductory textbooks and lectures, but also in cutting-edge research articles and reviews.
    Found 1 week, 6 days ago on Stanford Encyclopedia of Philosophy
  30.
    A satisfactory account of the nature of health is important for a wide range of theoretical and practical reasons. No theory offered in the literature thus far has been able to meet all the desiderata for an adequate theory of health. This paper introduces a new theory of health, according to which health is best defined in terms of dispositions at the level of the organism as a whole. After outlining the main features of the account and providing formal definitions of ‘health’, ‘healthy’, and ‘healthier than’, I present the main strengths of the proposed account. I argue that the proposed dispositional theory accounts for all paradigm cases of health and pathology, that it circumvents a number of problems faced by rival theories, and that it makes for a naturalistic theory of health with a rigorous metaphysical underpinning.
    Found 1 week, 6 days ago on PhilPapers