1. 86424.886108
    Information theory presupposes the notion of an epistemic agent, such as a scientist or an idealized human. Despite that, information theory is increasingly invoked by physicists concerned with fundamental physics, physics at very high energies, or generally with the physics of situations in which even idealized epistemic agents cannot exist. In this paper, I shall try to determine the extent to which the application of information theory in those contexts is legitimate. I will illustrate my considerations using the case of black hole thermodynamics and Bekenstein’s celebrated argument for his formula for the entropy of black holes. This example is particularly pertinent to the theme of the present collection because it is widely accepted as ‘empirical data’ in notoriously empirically deprived quantum gravity, even though the laws of black hole thermodynamics have so far evaded direct empirical confirmation.
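    For reference, the formula at issue is the Bekenstein–Hawking entropy, which scales with the area A of the black hole's event horizon rather than its volume:

```latex
S_{\mathrm{BH}} \;=\; \frac{k_B c^3 A}{4 G \hbar} \;=\; \frac{k_B A}{4\,\ell_P^2},
\qquad \ell_P = \sqrt{\frac{G\hbar}{c^3}}
```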
    Found 1 day ago on PhilSci Archive
  2. 150807.886177
    For more than twenty-five years, Fine has been challenging the traditional interpretation of the violations of Bell inequalities (BI) by experiment. A natural interpretation of Fine’s theorem is that it provides us with an alternative set of assumptions on which to put the blame for the failure of the BI, and a new interpretation of the violation of the BI by experiment should follow. This is not, however, how Fine interprets his theorem. Indeed, Fine claims that his result undermines other interpretations, including the traditional interpretation in terms of local realism. The aim of this paper is to understand and to assess Fine’s claims. We distinguish three different strategies that Fine uses in order to support his interpretation of his result. We show that none of these strategies is successful. Fine fails to prove that local realism is not at stake in the violation of the BI by quantum phenomena.
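    For reference, the most commonly tested Bell inequality is the CHSH form, which bounds the correlations E between measurement settings a, a′ and b, b′ for any local realist model:

```latex
\bigl| E(a,b) + E(a,b') + E(a',b) - E(a',b') \bigr| \;\le\; 2
```

    Quantum mechanics predicts, and experiment confirms, violations up to the Tsirelson bound of 2\sqrt{2}.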
    Found 1 day, 17 hours ago on PhilSci Archive
  3. 150910.886196
    The counterfactual tradition to defining actual causation has come a long way since Lewis started it off. However there are still important open problems that need to be solved. One of them is the (in)transitivity of causation. Endorsing transitivity was a major source of trouble for the approach taken by Lewis, which is why currently most approaches reject it. But transitivity has never lost its appeal, and there is a large literature devoted to understanding why this is so. Starting from a survey of this work, we will develop a formal analysis of transitivity and the problems it poses for causation. This analysis provides us with a sufficient condition for causation to be transitive, a sufficient condition for dependence to be necessary for causation, and several characterisations of the transitivity of dependence. Finally, we show how this analysis leads naturally to several conditions a definition of causation should satisfy, and use those to suggest a new definition of causation.
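    The failure of transitivity for counterfactual dependence can be illustrated with the classic hiker-and-boulder case (a standard example from this literature, not the paper's own formalism), sketched as structural equations in Python:

```python
# Hedged sketch: the "hiker and boulder" case, showing that counterfactual
# dependence is not transitive. Variable names are illustrative only.

def model(boulder, duck=None):
    """Structural equations: the hiker ducks iff the boulder falls;
    he survives unless the boulder falls and he fails to duck.
    Passing `duck` explicitly models an intervention on that variable."""
    if duck is None:
        duck = boulder
    survives = (not boulder) or duck
    return duck, survives

# Actual world: the boulder falls, the hiker ducks and survives.
duck, survives = model(boulder=True)

# Dependence 1: had the boulder not fallen, the hiker would not have ducked.
dep_duck_on_boulder = model(boulder=False)[0] != duck

# Dependence 2: had the hiker not ducked (boulder still falling), he dies.
dep_survival_on_duck = model(boulder=True, duck=False)[1] != survives

# But: had the boulder not fallen, the hiker would still have survived,
# so survival does not depend on the boulder: the chain is not transitive.
dep_survival_on_boulder = model(boulder=False)[1] != survives

print(dep_duck_on_boulder, dep_survival_on_duck, dep_survival_on_boulder)
# → True True False
```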
    Found 1 day, 17 hours ago on Ergo
  4. 194861.886222
    There is a familiar philosophical position – sometimes called the doctrine of the open future – according to which future contingents (claims about underdetermined aspects of the future) systematically fail to be true. For instance: supposing that there are ways things could develop from here in which Trump is impeached, and in which he is not, it is not now true that Trump will be impeached, and not now true that Trump will not be impeached. For well over 2000 years, however, open futurists have been accused of denying certain logical laws – bivalence, excluded middle, or both – for entirely ad hoc reasons, most notably, that their denials are required for the preservation of something we hold dear. In a recent paper, however, I sought to argue that this deeply entrenched narrative ought to be overturned. My thought was this: given a popular, plausible approach to the semantics of future contingents, we can reduce the question of their status to the Russell/Strawson debate concerning presupposition failure, definite descriptions, and bivalence. In that case, we will see that open futurists in fact needn’t deny bivalence (Russell), or, if they do, they will do so for perfectly general (Strawsonian) reasons – reasons for which we all must deny bivalence. Of course, the metaphysical objections to the open futurist’s model of the future will remain just as they were. However, the millennia-old “semantic” or “logical” objections to the doctrine would be answered.
    Found 2 days, 6 hours ago on PhilPapers
  5. 201760.886239
    Computer simulations of an epistemic landscape model, modified to include an explicit representation of a centralised funding body, show that the method of funding allocation has significant effects on the communal trade-off between exploration and exploitation, with consequences for the community’s ability to generate significant truths. The results show this effect is contextual, and depends on the size of the landscape being explored, with funding that includes explicit random allocation performing significantly better than peer review on large landscapes. The paper proposes a way of incorporating external institutional factors in formal social epistemology, and offers a way of bringing such investigations to bear on current research policy questions.
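    A minimal sketch of the kind of model described, with all specifics (the landscape, agent behaviour, and funding rules) assumed for illustration rather than taken from the paper:

```python
import random

# Toy epistemic-landscape model (assumptions, not the paper's model):
# agents hill-climb on a 1-D landscape of "significance" values; each round
# a funder lets three agents move. "peer_review" funds the agents currently
# at the highest points; "lottery" funds a random subset.

def simulate(allocation, size=200, n_agents=10, rounds=300, seed=0):
    rng = random.Random(seed)
    landscape = [rng.random() for _ in range(size)]     # significance per approach
    positions = [rng.randrange(size) for _ in range(n_agents)]
    visited = set(positions)
    for _ in range(rounds):
        if allocation == "peer_review":                 # fund current best performers
            funded = sorted(range(n_agents),
                            key=lambda i: landscape[positions[i]])[-3:]
        else:                                           # fund a random subset
            funded = rng.sample(range(n_agents), 3)
        for i in funded:
            new = (positions[i] + rng.choice([-1, 1])) % size
            if landscape[new] >= landscape[positions[i]]:   # greedy local search
                positions[i] = new
            visited.add(positions[i])
    return sum(landscape[p] for p in visited)           # total significance found

print(simulate("peer_review"), simulate("lottery"))
```

    Varying `size` in such a sketch is one way to probe the landscape-size contextuality the abstract reports.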
    Found 2 days, 8 hours ago on PhilSci Archive
  6. 202535.886257
    In this paper I investigate whether certain substructural theories are able to dodge paradox while at the same time containing what might be viewed as a naive validity predicate. To this end I introduce the requirement of internalization, roughly, that an adequate theory of validity should prove that its own metarules are validity-preserving. The main point of the paper is that substructural theories fail this requirement in various ways.
    Found 2 days, 8 hours ago on Ergo
  7. 208970.886275
    Continuing with my Egon Pearson posts in honor of his birthday, I reblog a post by Aris Spanos: “Egon Pearson’s Neglected Contributions to Statistics”. Egon Pearson (11 August 1895 – 12 June 1980) is widely known today for his contribution in recasting Fisher’s significance testing into the Neyman-Pearson (1933) theory of hypothesis testing. …
    Found 2 days, 10 hours ago on D. G. Mayo's blog
  8. 209209.886301
    It’s been a long time since I’ve blogged about the Complex Adaptive System Composition and Design Environment or CASCADE project run by John Paschkewitz. For a reminder, read these: • Complex adaptive system design (part 1), Azimuth, 2 October 2016. …
    Found 2 days, 10 hours ago on Azimuth
  9. 259456.886318
    As Harvey Brown emphasizes in his book Physical Relativity, inertial motion in general relativity is best understood as a theorem, and not a postulate. Here I discuss the status of the “conservation condition”, which states that the energy-momentum tensor associated with non-interacting matter is covariantly divergence-free, in connection with such theorems.
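    For reference, the conservation condition in question is the statement that the energy-momentum tensor of non-interacting matter is covariantly divergence-free:

```latex
\nabla_a T^{ab} = 0
```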
    Found 3 days ago on PhilSci Archive
  10. 306866.886335
    The spectrum argument purports to show that the better-than relation is not transitive, and consequently that orthodox value theory is built on dubious foundations. The argument works by constructing a sequence of increasingly less painful but more drawn-out experiences, such that each experience in the spectrum is worse than the previous one, yet the final experience is better than the experience with which the spectrum began. Hence the betterness relation admits cycles, threatening either transitivity or asymmetry of the relation. This paper examines recent attempts to block the spectrum argument, using the idea that it is a mistake to affirm that every experience in the spectrum is worse than its predecessor: an alternative hypothesis is that adjacent experiences may be incommensurable in value, or that due to vagueness in the underlying concepts, it is indeterminate which is better. While these attempts formally succeed as responses to the spectrum argument, they have additional, as yet unacknowledged costs that are significant. In order to effectively block the argument in its most typical form, in which the first element is radically inferior to the last, it is necessary to suppose that the incommensurability (or indeterminacy) is particularly acute: what might be called radical incommensurability (radical indeterminacy). We explain these costs, and draw some general lessons about the plausibility of the available options for those who wish to save orthodox axiology from the spectrum argument.
    Found 3 days, 13 hours ago on PhilPapers
  11. 308194.886353
    The need for expressing temporal constraints in conceptual models is well known, but it is unclear which representation is preferred and which is easier for modellers to understand. We assessed five different modes of representing temporal constraints: formal semantics, Description Logics notation, a coding-style notation, temporal EER diagrams, and (pseudo-)natural language sentences. The same information was presented to 15 participants in an experimental evaluation. Principally, it showed that 1) there was a clear preference for diagrams and natural language versus a dislike for the other representations; 2) diagrams were preferred for simple constraints, but the natural language rendering was preferred for more complex temporal constraints; and 3) a multi-modal modelling tool will be needed for the data analysis stage to be effective.
    Found 3 days, 13 hours ago on C. Maria Keet's site
  12. 317094.886367
    In this paper I discuss the delayed choice quantum eraser experiment by giving a straightforward account in standard quantum mechanics. At first glance, the experiment suggests that measurements on one part of an entangled photon pair (the idler) can be employed to control whether the measurement outcome of the other part of the photon pair (the signal) produces interference fringes at a screen after being sent through a double slit. Significantly, the choice whether there is interference or not can be made long after the signal photon encounters the screen. The results of the experiment have been alleged to invoke some sort of ‘backwards in time influences’. I argue that in the standard collapse interpretation the issue can be eliminated by taking into account the collapse of the overall entangled state due to the signal photon. Likewise, in the de Broglie-Bohm picture the particle’s trajectories can be given a well-defined description at any instant of time during the experiment. Thus, there is no need to resort to any kind of ‘backwards in time influence’. As a matter of fact, the delayed choice quantum eraser experiment turns out to resemble a Bell-type measurement, and so there really is no mystery.
    Found 3 days, 16 hours ago on PhilSci Archive
  13. 324480.886392
    This is a belated birthday post for E.S. Pearson (11 August 1895 – 12 June 1980). It’s basically a post from 2012 which concerns an issue of interpretation (long-run performance vs probativeness) that’s badly confused these days. …
    Found 3 days, 18 hours ago on D. G. Mayo's blog
  14. 368442.88641
    There’s a new paper on the arXiv that claims to solve a hard problem: • Norbert Blum, A solution of the P versus NP problem. Most papers that claim to solve hard math problems are wrong: that’s why these problems are considered hard. …
    Found 4 days, 6 hours ago on Azimuth
  15. 368445.886425
    We owe to Frege in Begriffsschrift our modern practice of taking unrestricted quantification (in one sense) as basic. I mean, he taught us how to rephrase restricted quantifications by using unrestricted quantifiers plus connectives in the now familiar way, so that e.g. …
    Found 4 days, 6 hours ago on Peter Smith's blog
  16. 646267.886441
    In this chapter, I will discuss what it takes for a dynamical collapse theory to provide a reasonable description of the actual world. I will start with discussions of what is required, in general, of the ontology of a physical theory, and then apply it to the quantum case. One issue of interest is whether a collapse theory can be a quantum state monist theory, adding nothing to the quantum state and changing only its dynamics. Although this was one of the motivations for advancing such theories, its viability has been questioned, and it has been argued that, in order to provide an account of the world, a collapse theory must supplement the quantum state with additional ontology, making such theories more like hidden-variables theories than would first appear. I will make a case for quantum state monism as an adequate ontology, and, indeed, the only sensible ontology for collapse theories. This will involve taking dynamical variables to possess, not sharp values, as in classical physics, but distributions of values.
    Found 1 week ago on PhilSci Archive
  17. 646292.886456
    I discuss a game-theoretic model in which scientists compete to finish the intermediate stages of some research project. Banerjee et al. (2014) have previously shown that if the credit awarded for intermediate results is proportional to their difficulty, then the strategy profile in which scientists share each intermediate stage as soon as they complete it is a Nash equilibrium. I show that the equilibrium is both unique and strict. Thus rational credit-maximizing scientists have an incentive to share their intermediate results, as long as this is sufficiently rewarded.
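    The notion of a strict Nash equilibrium at issue can be illustrated with a toy normal-form game. The payoff numbers below are assumed purely for illustration and are not Banerjee et al.'s model; they merely encode a situation in which mutual sharing strictly beats every unilateral deviation:

```python
# Hedged sketch: strict Nash equilibrium check for a 2-player game.
# Strategies: share intermediate results immediately, or withhold them.
# Payoffs are illustrative assumptions, not taken from the paper.

payoffs = {  # (row strategy, col strategy) -> (row payoff, col payoff)
    ("share", "share"):       (3, 3),
    ("share", "withhold"):    (2, 2),
    ("withhold", "share"):    (2, 2),
    ("withhold", "withhold"): (1, 1),
}
strategies = ["share", "withhold"]

def is_strict_nash(profile):
    """A profile is a strict Nash equilibrium iff every unilateral
    deviation makes the deviating player strictly worse off."""
    r, c = profile
    row_ok = all(payoffs[(r, c)][0] > payoffs[(d, c)][0]
                 for d in strategies if d != r)
    col_ok = all(payoffs[(r, c)][1] > payoffs[(r, d)][1]
                 for d in strategies if d != c)
    return row_ok and col_ok

print([p for p in payoffs if is_strict_nash(p)])   # → [('share', 'share')]
```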
    Found 1 week ago on PhilSci Archive
  18. 696545.886472
    Persistence judgments are ordinary judgments about whether an object survives a change, or perishes. For instance, if a house fire only superficially damages the kitchen, people judge that the house survived. But if the fire burnt the house to the ground instead, people judge that the house did not survive but was instead destroyed. We are interested in what drives these judgments, in part because objects are so central to our conception of the world, and our persistence judgments get to the very heart of the folk notion of an object.
    Found 1 week, 1 day ago on PhilPapers
  19. 700408.886486
    In models for paraconsistent logics, the semantic values of sentences and their negations are less tightly connected than in classical logic. In “American Plan” logics for negation, truth and falsity are, to some degree, independent. The truth of ∼p is given by the falsity of p, and the falsity of ∼p is given by the truth of p. Since truth and falsity are only loosely connected, p and ∼p can both hold, or both fail to hold. In “Australian Plan” logics for negation, negation is treated rather like a modal operator, where the truth of ∼p in a situation amounts to p failing in certain other situations. Since those situations can be different from this one, p and ∼p might both hold here, or might both fail here.
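    The American Plan treatment can be sketched by assigning each sentence a *set* of classical values, as in first-degree entailment; a minimal sketch (not tied to any particular paper's formalism):

```python
# Sketch of "American Plan" negation: a sentence's semantic value is a
# subset of {"T", "F"}, so it can be both true and false, or neither.
# Negation swaps truth and falsity, as described in the abstract.

def neg(v):
    out = set()
    if "F" in v:
        out.add("T")   # ~p is true iff p is false
    if "T" in v:
        out.add("F")   # ~p is false iff p is true
    return frozenset(out)

both = frozenset({"T", "F"})
neither = frozenset()
print("T" in both and "T" in neg(both))   # → True: p and ~p both hold
print(neg(neither) == neither)            # → True: p and ~p both fail to hold
```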
    Found 1 week, 1 day ago on Greg Restall's site
  20. 779215.886501
    Last week a team of 72 scientists released the preprint of an article attempting to address one aspect of the reproducibility crisis, the crisis of conscience in which scientists are increasingly skeptical about the rigor of our current methods of conducting scientific research. …
    Found 1 week, 2 days ago on D. G. Mayo's blog
  21. 819045.886517
    Suppose that I am throwing a perfectly sharp dart uniformly randomly at a continuous target. The chance that I will hit the center is zero. What if I throw an infinite number of independent darts at the target? …
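    A quick numerical illustration of the measure-zero point (a sketch only; the continuous target is modelled as the unit square):

```python
import random

# A continuous uniform "dart" on [0, 1) x [0, 1). The chance of hitting the
# exact centre is zero, so a finite simulation never produces it, while hits
# on a region of positive area occur at roughly the rate of its measure.

random.seed(1)
n = 100_000
centre_hits = 0
disc_hits = 0   # disc of radius 0.1 about the centre, area pi * 0.01
for _ in range(n):
    x, y = random.random(), random.random()
    if (x, y) == (0.5, 0.5):
        centre_hits += 1
    if (x - 0.5) ** 2 + (y - 0.5) ** 2 <= 0.01:
        disc_hits += 1

print(centre_hits)                                    # → 0
print(abs(disc_hits / n - 3.14159 * 0.01) < 0.005)    # → True
```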
    Found 1 week, 2 days ago on Alexander Pruss's Blog
  22. 876901.886532
    The claim of inflationary cosmology to explain certain observable facts, which the Friedmann-Robertson-Walker models of ‘Big-Bang’ cosmology were forced to assume, has already been the subject of significant philosophical analysis. However, the principal empirical claim of inflationary cosmology, that it can predict the scale-invariant power spectrum of density perturbations, as detected in measurements of the cosmic microwave background radiation, has hitherto been taken at face value by philosophers. The purpose of this paper is to expound the theory of density perturbations used by inflationary cosmology, to assess whether inflation really does predict a scale-invariant spectrum, and to identify the assumptions necessary for such a derivation. The first section of the paper explains what a scale-invariant power spectrum is, and the requirements placed on a cosmological theory of such density perturbations. The second section explains and analyses the concept of the Hubble horizon, and its behaviour within an inflationary space-time. The third section expounds the inflationary derivation of scale-invariance, and scrutinises the assumptions within that derivation. The fourth section analyses the explanatory role of ‘horizon-crossing’ within the inflationary scenario.
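    For reference, in the usual convention the dimensionless power spectrum of primordial curvature perturbations is parametrised by an amplitude and a spectral index, and scale invariance is the statement that the index equals one, so that perturbations have the same amplitude on all scales:

```latex
\Delta^2_{\mathcal{R}}(k) \;=\; A_s \left(\frac{k}{k_*}\right)^{n_s - 1},
\qquad \text{scale invariance:}\; n_s = 1
```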
    Found 1 week, 3 days ago on PhilSci Archive
  23. 876919.886546
    In the context of superintelligent AI systems, the term “oracle” has two meanings. One refers to modular systems queried for domain-specific tasks. Another usage, referring to a class of systems which may be useful for addressing the value alignment and AI control problems, is a superintelligent AI system that only answers questions. The aim of this manuscript is to survey contemporary research problems related to oracles which align with long-term research goals of AI safety. We examine existing question answering systems and argue that their high degree of architectural heterogeneity makes them poor candidates for rigorous analysis as oracles. On the other hand, we identify computer algebra systems (CASs) as being primitive examples of domain-specific oracles for mathematics and argue that efforts to integrate computer algebra systems with theorem provers, systems which have largely been developed independently of one another, provide a concrete set of problems related to the notion of provable safety that has emerged in the AI safety community. We review approaches to interfacing CASs with theorem provers, describe well-defined architectural deficiencies that have been identified with CASs, and suggest possible lines of research and practical software projects for scientists interested in AI safety.
    Found 1 week, 3 days ago on PhilSci Archive
  24. 931130.886561
    We give a precise semantics for a proposed revised version of the Knowledge Interchange Format. We show that quantification over relations is possible in a first-order logic, but sequence variables take the language beyond first-order.
    Found 1 week, 3 days ago on Chris Menzel's site
  25. 1010481.886575
    We report on progress and an unsolved problem in our attempt to obtain a clear rationale for relevance logic via semantic decomposition trees. Suitable decomposition rules, constrained by a natural parity condition, generate a set of directly acceptable formulae that contains all axioms of the well-known system R, is closed under substitution and conjunction, satisfies the letter-sharing condition, but is not closed under detachment. To extend it, a natural recursion is built into the procedure for constructing decomposition trees. The resulting set of acceptable formulae has many attractive features, but it remains an open question whether it continues to satisfy the crucial letter-sharing condition.
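    The letter-sharing condition itself is easy to state operationally; a minimal sketch (the string-based formula syntax and letter range p..z are assumptions for illustration):

```python
import re

# Sketch of the letter-sharing (variable-sharing) condition of relevance
# logic: in an acceptable implication A -> B, the antecedent and consequent
# must share at least one propositional letter. Formulas are plain strings
# whose propositional letters are single characters in p..z.

def letters(formula):
    """Collect the propositional letters occurring in a formula string."""
    return set(re.findall(r"[p-z]", formula))

def shares_letters(antecedent, consequent):
    return bool(letters(antecedent) & letters(consequent))

print(shares_letters("p & q", "q | r"))   # → True
print(shares_letters("p", "q | ~q"))      # → False, although classically
                                          #   'p -> (q | ~q)' is valid
```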
    Found 1 week, 4 days ago on The Australasian Journal of Logic
  26. 1043861.886589
    J. D. Hamkins and Ø. Linnebo, “The modal logic of set-theoretic potentialism and the potentialist maximality principles” (manuscript in preparation), arXiv:1708.01644 [math.LO], http://jdh.hamkins.org/set-theoretic-potentialism. Abstract. …
    Found 1 week, 5 days ago on Joel David Hamkins's blog
  27. 1049951.886603
    The standard propositional account of necessary and sufficient conditions in many introductory logic textbooks is based on the material conditional. Some examples include (Barker-Plummer, Barwise, and Etchemendy 2011: 181-182), (Churchill 1986: 391-392), (Forbes 1994: 20-25), (Gabbay 2002: 68), (Haight 1999: 187-189), (Halverson 1984: 285-286), (Hardegree 2011: 129), (Layman 2002: 250-251), (Leblanc and Wisdom 1976: 16-18), (Salmon 1984: 47-48), (P. Smith 2003: 132), (Suppes 1957: 8-10) and (Watson and Arp 2015: 149). In the appendix, pertinent excerpts from some of these resources are provided. In general, the typical exposition goes along the following lines (again, cf. the appendix): • “A is sufficient for B” is best rendered as “if A, then B”, or symbolically, (A ⊃ B). • “A is necessary for B” is best rendered as “if not A, then not B”, or symbolically, (¬A ⊃ ¬B). This is equivalent to (B ⊃ A).
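    The equivalence claimed at the end can be checked mechanically; a minimal truth-table sketch:

```python
from itertools import product

# Sketch verifying the textbook claim with a truth table: the material-
# conditional rendering of "A is necessary for B" as (not A -> not B)
# is equivalent to (B -> A) on all four valuations.

def implies(p, q):
    return (not p) or q   # material conditional

for a, b in product([True, False], repeat=2):
    assert implies(not a, not b) == implies(b, a)

print("equivalent on all four rows")
```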
    Found 1 week, 5 days ago on The Australasian Journal of Logic
  28. 1050046.886618
    A central proposition of this book is that there are no universal rules for inductive inference. The chapters so far have sought to argue for this proposition and to illustrate it by showing how several popular accounts of inductive inference fail to provide universally applicable rules. Many in an influential segment of the philosophy of science community will judge these efforts to be mistaken and futile. In their view, the problem has been solved, finally and irrevocably.
    Found 1 week, 5 days ago on John Norton's site
  29. 1077503.886632
    We propose an investigation of the ways in which speakers’ subjective perspectives are likely to affect the meaning of gradable adjectives like tall or heavy. We present the results of a study showing that people tend to use themselves as a yardstick when ascribing these adjectives to human figures of variable measurements: subjects’ height and weight requirements for applying tall and heavy are found to be positively correlated with their personal measurements. We draw more general lessons regarding the definition of subjectivity and the ways in which a standard of comparison, and a significant deviation from that standard, are specified.
    Found 1 week, 5 days ago on Paul Egré's site
  30. 1077551.886646
    Recent ideas about epistemic modals and indicative conditionals in formal semantics have significant overlap with ideas in modal logic and dynamic epistemic logic. The purpose of this paper is to show how greater interaction between formal semantics and dynamic epistemic logic in this area can be of mutual benefit. In one direction, we show how concepts and tools from modal logic and dynamic epistemic logic can be used to give a simple, complete axiomatization of Yalcin’s [16] semantic consequence relation for a language with epistemic modals and indicative conditionals. In the other direction, the formal semantics for indicative conditionals due to Kolodny and MacFarlane [9] gives rise to a new dynamic operator that is very natural from the point of view of dynamic epistemic logic, allowing succinct expression of dependence (as in dependence logic) or supervenience statements. We prove decidability for the logic with epistemic modals and Kolodny and MacFarlane’s indicative conditional via a full and faithful computable translation from their logic to the modal logic K45.
    Found 1 week, 5 days ago on Wesley Holliday's site