1.
    An axiomatic theory of truth is a deductive theory of truth as a primitive undefined predicate. Because of the liar and other paradoxes, the axioms and rules have to be chosen carefully in order to avoid inconsistency. Many axiom systems for the truth predicate have been discussed in the literature, and their respective properties have been analysed. Several philosophers, including many deflationists, have endorsed axiomatic theories of truth in their accounts of truth. The logical properties of the formal theories are relevant to various philosophical questions, such as questions about the ontological status of properties, Gödel’s theorems, truth-theoretic deflationism, the eliminability of semantic notions, and the theory of meaning.
    Found 1 hour, 43 minutes ago on Stanford Encyclopedia of Philosophy
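As a rough illustration of the kind of axioms at issue (a standard textbook presentation, not a summary of any one system the entry surveys): the naive T-schema must be restricted to block the liar, and the common restriction is to sentences that do not themselves contain the truth predicate.

```latex
% Disquotational axioms (the theory often called TB): one axiom per
% sentence \varphi of the base language, where \varphi may not itself
% contain the truth predicate T -- this restriction blocks the liar
% sentence "this sentence is not true".
T(\ulcorner \varphi \urcorner) \leftrightarrow \varphi
% Compositional axioms (as in Tarski-style theories such as CT)
% instead quantify over sentences, e.g. for negation:
\forall \varphi \, \bigl( \mathrm{Sent}(\varphi) \rightarrow
  ( T(\neg\varphi) \leftrightarrow \neg T(\varphi) ) \bigr)
```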
  2.
    We discuss some recent work by Tim Maudlin concerning Black Hole Information Loss. We argue, contra Maudlin, that there is a paradox, in the straightforward sense that there are propositions that appear true, but which are incompatible with one another. We discuss the significance of the paradox and Maudlin’s response to it.
    Found 1 hour, 44 minutes ago on PhilSci Archive
  3.
    In this paper, we provide a Bayesian analysis of the well-known surprise exam paradox. Central to our analysis is a probabilistic account of what it means for the student to accept the teacher’s announcement that he will receive a surprise exam. According to this account, the student can be said to have accepted the teacher’s announcement provided he adopts a subjective probability distribution relative to which he expects to receive the exam on a day on which he expects not to receive it. We show that as long as expectation is not equated with subjective certainty there will be contexts in which it is possible for the student to accept the teacher’s announcement, in this sense. In addition, we show how a Bayesian modeling of the scenario can yield plausible explanations of the following three intuitive claims: (1) the teacher’s announcement becomes easier to accept the more days there are in class; (2) a strict interpretation of the teacher’s announcement does not provide the student with any categorical information as to the date of the exam; and (3) the teacher’s announcement contains less information about the date of the exam the more days there are in class. To conclude, we show how the surprise exam paradox can be seen as one among the larger class of paradoxes of doxastic fallibilism, foremost among which is the paradox of the preface.
    Found 12 hours, 27 minutes ago on PhilPapers
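The abstract's probabilistic account lends itself to a toy computation. Below is a minimal sketch (my own reconstruction, not the authors' model): credence is spread uniformly over N class days, "expecting not to receive the exam" on a day is read as the hazard rate falling below 1/2, and claims (1) and (3) show up as the surprise probability and the date entropy both growing with N.

```python
"""Toy Bayesian model of the surprise exam (a sketch, not the authors'
exact formalism). The student spreads credence uniformly over N days;
'expecting not to receive the exam' on a day is read as the hazard
rate -- P(exam today | none so far) -- falling below 1/2, since
expectation is not equated with subjective certainty."""
import math

def surprise_analysis(n_days, threshold=0.5):
    # Hazard on day d (1-indexed) under a uniform prior: 1/(n-d+1).
    hazards = [1 / (n_days - d + 1) for d in range(1, n_days + 1)]
    # A day is 'surprising' if the student then expects no exam.
    p_surprise = sum(1 / n_days for h in hazards if h < threshold)
    # Entropy of the exam date: how little the announcement pins down.
    entropy = math.log2(n_days)
    return p_surprise, entropy

for n in (2, 5, 20, 100):
    p, h = surprise_analysis(n)
    print(f"{n:3d} days: P(surprise) = {p:.2f}, date entropy = {h:.2f} bits")
```

On this toy reading, the more days there are, the more of the student's credence falls on days where he then expects no exam, which is one way to see why the announcement becomes easier to accept as the class grows.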
  4.
    One well-known objection to the principle of maximum entropy is the so-called Judy Benjamin problem, first introduced by van Fraassen ([1981]). The problem turns on the apparently puzzling fact that, on the basis of information relating an event’s conditional probability, the maximum entropy distribution will almost always assign to the conditioning event a probability strictly less than that assigned to it by the uniform distribution. In this paper, I present an analysis of the Judy Benjamin problem that can help to make sense of this seemingly odd feature of maximum entropy inference. My analysis is based on the claim that, in applying the principle of maximum entropy, Judy Benjamin is not acting out of a concern to maximize uncertainty in the face of new evidence, but is rather exercising a certain brand of epistemic charity towards her informant. This epistemic charity takes the form of an assumption on the part of Judy Benjamin that her informant’s evidential report leaves out no relevant information. Such a reconceptualization of the motives underlying Judy Benjamin’s appeal to the principle of maximum entropy can help to further our understanding of the true epistemological grounds of this principle and, in particular, can shed light on the nature of the relationship between the principle of maximum entropy and the Laplacean principle of insufficient reason.
    Found 14 hours, 56 minutes ago on PhilPapers
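The numbers behind the puzzle are easy to reproduce. In van Fraassen's setup, Judy Benjamin starts uniform over Blue territory, Red HQ, and Red 2nd company, and learns P(2nd | Red) = 3/4; minimizing Kullback–Leibler divergence to the prior (the maximum entropy rule) then pulls P(Red), the conditioning event, below its prior 1/2. A numerical sketch:

```python
"""The Judy Benjamin problem (van Fraassen 1981), numerically. Three
cells: Blue territory, Red HQ, Red 2nd company, with the uniform prior
(1/2, 1/4, 1/4). The informant reports P(2nd | Red) = 3/4. Updating by
minimizing KL divergence to the prior (the maximum entropy rule)
drives P(Red) strictly below its prior value of 1/2."""
import numpy as np
from scipy.optimize import minimize

prior = np.array([0.5, 0.25, 0.25])  # [Blue, Red HQ, Red 2nd]

def kl(p):
    # Relative entropy of candidate posterior p with respect to prior.
    return float(np.sum(p * np.log(p / prior)))

cons = (
    {"type": "eq", "fun": lambda p: p.sum() - 1.0},
    # Constraint P(2nd | Red) = 3/4, i.e. p[2] = 3 * p[1].
    {"type": "eq", "fun": lambda p: p[2] - 3.0 * p[1]},
)
res = minimize(kl, prior, bounds=[(1e-9, 1)] * 3, constraints=cons)
blue, hq, second = res.x
print(f"P(Blue) = {blue:.3f}, P(Red) = {hq + second:.3f}")
# Expected: P(Blue) ~ 0.533, so P(Red) ~ 0.467 < 0.5, the odd feature
# the paper reinterprets in terms of epistemic charity.
```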
  5.
    I will discuss two arguments in favor of perdurance. The first is Sider’s argument from vagueness. Sider regards this argument as “one of the most powerful” in favor of perdurantism. I make the observation – obvious once made, but I am unable to find it elsewhere in the literature – that endurantists have principled grounds to claim that the argument is unsound (§§I–III). Having made this observation, I use it to emphasize a somewhat neglected difference between endurantists and perdurantists with respect to their views on material objects (§IV). These views, in the case of endurantists, lead to a further, less than conclusive but nevertheless interesting argument against endurantism – the anti-fundamentality argument – which I discuss in the second half of the paper (§§V–VI).
    Found 1 day, 5 hours ago on Antony Eagle's site
  6.
    A previously unrecognised argument against deterministic chance is introduced. The argument rests on the twin ideas that determined outcomes are settled, while chancy outcomes are unsettled, thus making cases of determined but chancy outcomes impossible. Closer attention to tacit assumptions about settledness makes available some principled lines of resistance to the argument for compatibilists about chance and determinism. Yet the costs of maintaining compatibilism may be higher with respect to this argument than with respect to existing incompatibilist arguments.
    Found 1 day, 5 hours ago on Antony Eagle's site
  7.
    Found 1 day, 6 hours ago on Oisín Deery's site
  8.
    An increasingly popular account of logic, Anti-Exceptionalism, views logic as similar to, and continuous with, other scientific theories. It thus treats revision of logic analogously to revision of scientific theories, applying familiar abductive standards of scientific theory choice to the case of logic. We should, that is, move from one logical theory L to another L′ when L′ does “better” than L in terms of theoretical virtues like: ...simplicity, ontological leanness (Occam’s razor), explanatory power, a low degree of ad hocness, unity, [and] fruitfulness. (Priest 2006: 135) This methodology is intended to explain rational change of logic; nothing so detailed is needed to explain the vacillating flirtations we might have with one logic or another. Abductive methodology is supposed to provide justification for moving from one logic to another, one whole body of logic, that is: the particular and common version of this methodology I am interested in here is not aimed at settling whether we should revise any particular logical principle.
    Found 1 day, 7 hours ago on PhilPapers
  9.
    We give a probabilistic justification of the shape of one of the probability weighting functions used in Prospect Theory. To do so, we use an idea recently introduced by Herzog and Hertwig (2014). Along the way we also suggest a new method for the aggregation of probabilities using statistical distances.
    Found 1 day, 7 hours ago on PhilPapers
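The abstract does not say which weighting function the paper derives, but for orientation: the one-parameter Tversky–Kahneman (1992) form below is the best-known example of the inverse-S shape used in Prospect Theory, shown here purely as a representative instance.

```python
"""The inverse-S probability weighting function of cumulative prospect
theory (Tversky & Kahneman 1992), shown only as a representative
example -- the entry's paper may target a different weighting
function. Small probabilities are overweighted, moderate and large
ones underweighted."""

def tk_weight(p: float, gamma: float = 0.61) -> float:
    # w(p) = p^g / (p^g + (1-p)^g)^(1/g); gamma = 0.61 is Tversky and
    # Kahneman's median estimate for gains.
    num = p ** gamma
    return num / (num + (1 - p) ** gamma) ** (1 / gamma)

for p in (0.01, 0.1, 0.5, 0.9, 0.99):
    print(f"w({p:.2f}) = {tk_weight(p):.3f}")
# w(0.01) > 0.01 (overweighting); w(0.99) < 0.99 (underweighting).
```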
  10.
    This symposium in the overlap of philosophy and decision theory is described well by its title “Beliefs in Groups”. Each word in the title matters, with one intended ambiguity. The symposium is about beliefs rather than other attitudes such as preferences; these beliefs take the form of probabilities in the first three contributions, binary yes/no beliefs (‘judgments’) in the fourth contribution, and qualitative probabilities (‘probability grades’) in the fifth contribution. The beliefs occur in groups, which is ambiguous between beliefs of groups as a whole and beliefs of group members. The five contributions – all of them interesting, we believe – address several aspects of this general theme.
    Found 1 day, 8 hours ago on Franz Dietrich's site
  11.
    Traditionally, empiricism has relied on the specialness of human observation, yet science is rife with sophisticated instrumentation and techniques. The present paper advances a conception of empirical evidence applicable to actual scientific practice. I argue that this conception elucidates how the results of scientific research can be repurposed across diverse epistemic contexts—it helps to make sense of how evidence accumulates across theory change, how different evidence can be amalgamated and used jointly, and how the same evidence can be used to constrain competing theories in the service of breaking local underdetermination.
    Found 1 day, 9 hours ago on PhilSci Archive
  12.
    This paper reviews the structure of standard quantum mechanics, introducing the basics of the von Neumann–Dirac axiomatic formulation as well as the well-known Copenhagen interpretation. We also review the major conceptual difficulties arising from this theory, first and foremost the well-known measurement problem. The main aim of this essay is to show that the conundrums affecting quantum mechanics can be solved via the methodology provided by the primitive ontology approach. Using Bohmian mechanics as an example, the paper argues for a realist attitude towards quantum theory. In the second place, it discusses the Quinean criterion for ontology and its limits when it comes to quantum physics, arguing that the primitive ontology programme should be considered an improvement on Quine’s method for determining the ontological commitments of a theory.
    Found 1 day, 9 hours ago on PhilSci Archive
  13.
    The mereological predicate ‘is part of’ can be used to define the predicate ‘is identical with’. I argue that this entails that mereological theories can be ideologically simpler than nihilistic theories that do not use the notion of parthood—contrary to what has been argued by Ted Sider. Moreover, if one accepts an extensional mereology, there are good philosophical reasons apart from ideological simplicity to give a mereological definition of identity.
    Found 1 day, 13 hours ago on PhilPapers
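For orientation, the definitional move at issue can be sketched as follows (the standard formulation from classical extensional mereology; the paper's own definition may differ in detail):

```latex
% Taking parthood (\preceq) as primitive, identity is definable as
% mutual parthood:
x = y \;:\leftrightarrow\; x \preceq y \,\wedge\, y \preceq x
% or, in an extensional mereology, as sameness of parts:
x = y \;:\leftrightarrow\; \forall z \,( z \preceq x \leftrightarrow z \preceq y )
% Either way, a theory with \preceq need not count = among its
% ideological primitives, which is the simplicity claim pressed
% against Sider.
```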
  14.
    In this post I want to argue for this claim: (1) If a computer can non-accidentally have free will, compatibilism is true. Compatibilism here is the thesis that free will and determinism can both obtain. My interest in (1) is that I think compatibilism is false, and hence I conclude from (1) that computers cannot non-accidentally have free will. …
    Found 3 days, 6 hours ago on Alexander Pruss's Blog
  15.
    Stephen Senn, Head of the Competence Center for Methodology and Statistics (CCMS), Luxembourg Institute of Health (Twitter: @stephensenn). “Being a statistician means never having to say you are certain.” A recent discussion of randomised controlled trials [1] by Angus Deaton and Nancy Cartwright (D&C) contains much interesting analysis but also, in my opinion, does not escape rehashing some of the invalid criticisms of randomisation with which the literature seems to be littered. …
    Found 4 days, 22 hours ago on D. G. Mayo's blog
  16.
    Consider the principle that for a given agent S, and any proposition p, it is metaphysically possible that S is thinking p, and p alone, at time t. According to philosophical folklore, this principle cannot be true, despite its initial appeal, because there are more propositions than possible worlds: the principle would require a different possible world to witness the thinking of each proposition, and there simply aren’t enough possible worlds to go around. Some theorists have taken comfort in the thought that, when taken in conjunction with facts about human psychology, the principle was not on particularly firm footing to begin with: most propositions are far too complicated for any human to grasp, much less think uniquely.
    Found 1 week ago on Andrew Bacon's site
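The folklore argument gestured at here is a cardinality argument in the style of Kaplan's paradox. Reconstructed under the assumption that propositions are (or are at least as numerous as) sets of worlds, it runs:

```latex
% Let W be the set of possible worlds and \mathcal{P}(W) its power
% set. If every set of worlds is a proposition, the principle demands,
% for each proposition p, a witnessing world w_p at which S thinks p
% and p alone. Distinct propositions need distinct witnesses, giving
% an injection
\mathcal{P}(W) \hookrightarrow W,
% i.e. 2^{|W|} \le |W|, which contradicts Cantor's theorem
% 2^{|W|} > |W|. So not every proposition can have its own world.
```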
  17.
    Conditional probability is one of the central concepts in probability theory. Some notion of conditional probability is part of every interpretation of probability. The basic mathematical fact about conditional probability is that p(A|B) = p(A ∧ B)/p(B) where this is defined. However, while it has been typical to take this as a definition or analysis of conditional probability, some (perhaps most prominently Hájek (2003)) have argued that conditional probability should instead be taken as the primitive notion, so that this formula is at best coextensive, and at worst sometimes gets it wrong. Section 1 of this article considers the notion of conditional probability, and the two main sorts of arguments that it should be taken as primitive, as well as some mathematical principles that have been alleged to be important for it. Sections 2 and 3 then describe the two main competing mathematical formulations of conditional probability.
    Found 1 week ago on Kenny Easwaran's site
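A small numerical illustration of the dialectic (the example and helper names are mine, not the article's): the ratio formula works wherever p(B) > 0, but it is silent on probability-zero conditions, which is one standard motivation for taking conditional probability as primitive.

```python
"""The ratio analysis p(A|B) = p(A & B)/p(B) on a discrete space, and
the gap it leaves: conditioning on a probability-zero event is simply
undefined on this analysis, even where a conditional probability can
seem perfectly meaningful. A toy sketch with a fair die."""
from fractions import Fraction

omega = set(range(1, 7))  # outcomes of a fair die

def p(event):
    return Fraction(len(event & omega), len(omega))

def cond(a, b):
    # Ratio definition: undefined when p(B) = 0.
    if p(b) == 0:
        raise ZeroDivisionError("the ratio analysis is silent here")
    return p(a & b) / p(b)

even, low = {2, 4, 6}, {1, 2, 3}
print(cond(even, low))  # 1/3
try:
    cond(even, set())
except ZeroDivisionError as e:
    print("p(even | zero-probability event):", e)
```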
  18.
    Can an agent hold a meaningful credence that she will perform an action, as she decides whether to do so? No, say some (e.g., Spohn, 1977; Levi, 1997), for Deliberation Crowds Out Prediction (DCOP). In our view the DCOP debate pays insufficient attention to the meaning of terms such as ‘credence’, ‘deliberation’, and ‘agent’. We note below that if these terms are understood sufficiently broadly, it is trivial that DCOP fails. Nevertheless, we argue, there are familiar understandings of these terms such that DCOP holds, for reasons related to the so-called ‘transparency’ of an agent’s present-tensed access to certain of her own psychological states. In that sense, we defend DCOP, explaining it in terms of transparency.
    Found 1 week, 1 day ago on Huw Price's site
  19.
    There are at least two traditional conceptions of numerical degree of similarity. According to the first, the degree of dissimilarity between two particulars is their distance apart in a metric space. According to the second, the degree of similarity between two particulars is a function of the number of (sparse) properties they have in common and not in common. This paper argues that these two conceptions are logically independent, but philosophically inconsonant.
    Found 1 week, 1 day ago on PhilPapers
  20.
    In a recent paper, Wigglesworth claims that syntactic criteria of theoretical equivalence are not appropriate for settling questions of equivalence between logical theories, since such criteria judge classical and intuitionistic logic to be equivalent; he concludes that logicians should use semantic criteria instead. However, this is an artefact of the particular syntactic criterion chosen, which is an implausible criterion of theoretical equivalence (even in the non-logical case). Correspondingly, there is nothing to suggest that a more plausible syntactic criterion should not be used to settle questions of equivalence between different logical theories; such a criterion (which may already be found in the literature) is exhibited and shown to judge classical and intuitionistic logic to be inequivalent.
    Found 1 week, 1 day ago on PhilSci Archive
  21.
    I present a formal ontological theory where the basic building blocks of the world can be either things or events. In any case, the result is a Parmenidean worldview where change is not a global property. What we understand by change manifests as asymmetries in the pattern of the world-lines that constitute 4-dimensional existents. I maintain that such a view is in accord with current scientific knowledge.
    Found 1 week, 1 day ago on PhilSci Archive
  22.
    It has recently been argued that a non-Bayesian probabilistic version of inference to the best explanation (IBE*) has a number of advantages over Bayesian conditionalization (Douven [2013]; Douven and Wenmackers [2017]). We investigate how IBE* could be generalized to uncertain evidential situations and formulate a novel updating rule IBE**. We then inspect how it performs in comparison to its Bayesian counterpart, Jeffrey conditionalization (JC), in a number of simulations where two agents, each updating by IBE** and JC, respectively, try to detect the bias of a coin while they are only partially certain what side the coin landed on. We show that IBE** more often prescribes high probability to the actual bias than JC. We also show that this happens considerably faster, that IBE** passes higher thresholds for high probability, and that it in general leads to more accurate probability distributions than JC.
    Found 1 week, 3 days ago on PhilPapers
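The simulation setup is easy to sketch. The rule below is my reconstruction in the spirit of Douven-style explanationist updating (a Jeffrey update plus a small bonus c to the hypothesis that best explains the perceived record); the paper's IBE** may differ in detail.

```python
"""Two agents estimate a coin's bias from uncertain evidence: each
toss is perceived correctly only with probability q, so the evidence
partition {heads, tails} carries posterior weights (q, 1-q). One
agent uses Jeffrey conditionalization (JC); the other adds a small
explanatory bonus c -- a sketch of an IBE-style rule, not necessarily
the paper's exact IBE**."""
import math
import random

random.seed(1)
biases = [i / 10 for i in range(11)]      # hypotheses about P(heads)
true_bias, q, c, n_tosses = 0.7, 0.8, 0.1, 300

def perceive_prob(b):
    # Probability of *perceiving* heads, given bias b and reliability q.
    return q * b + (1 - q) * (1 - b)

def jeffrey(prior, perceived):
    # Jeffrey's rule over {H, T} with weight q on the perceived side.
    def posterior_given(side):
        lik = [b if side == "H" else 1 - b for b in biases]
        z = sum(l * p for l, p in zip(lik, prior))
        return [l * p / z for l, p in zip(lik, prior)]
    h, t = posterior_given("H"), posterior_given("T")
    w = q if perceived == "H" else 1 - q
    return [w * x + (1 - w) * y for x, y in zip(h, t)]

def ibe(prior, perceived, n_h, n_t):
    # JC step, then bonus c to the hypothesis that (allowing for
    # misperception) makes the perceived record so far most likely.
    post = jeffrey(prior, perceived)
    def loglik(b):
        p = perceive_prob(b)
        return n_h * math.log(p) + n_t * math.log(1 - p)
    best = max(range(len(biases)), key=lambda i: loglik(biases[i]))
    post[best] += c
    z = sum(post)
    return [x / z for x in post]

flip = {"H": "T", "T": "H"}
p_jc, p_ibe = [1 / 11] * 11, [1 / 11] * 11
n_h = n_t = 0
for _ in range(n_tosses):
    actual = "H" if random.random() < true_bias else "T"
    perceived = actual if random.random() < q else flip[actual]
    n_h += perceived == "H"
    n_t += perceived == "T"
    p_jc = jeffrey(p_jc, perceived)
    p_ibe = ibe(p_ibe, perceived, n_h, n_t)

i = biases.index(true_bias)
print(f"credence in true bias 0.7: JC {p_jc[i]:.3f}, IBE-style {p_ibe[i]:.3f}")
```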
  23.
    Johannes Kepler loved geometry, so of course he was fascinated by Platonic solids. His early work Mysterium Cosmographicum, written in 1596, includes pictures showing how the 5 Platonic solids correspond to the 5 elements: Five elements? …
    Found 1 week, 4 days ago on Azimuth
  24.
    This paper proposes a reading of the history of equivalence in mathematics. The paper has two main parts. The first part focuses on a relatively short historical period when the notion of equivalence is about to be decontextualized, but yet, has no commonly agreed-upon name. The method for this part is rather straightforward: following the clues left by the others for the ‘first’ modern use of equivalence. The second part focuses on a relatively long historical period when equivalence is experienced in context. The method for this part is to strip the ideas from their set-theoretic formulations and methodically examine the variations in the ways equivalence appears in some prominent historical texts. The paper reveals several critical differences in the conceptions of equivalence at different points in history that are at variance with the standard account of the mathematical notion of equivalence encompassing the concepts of equivalence relation and equivalence class.
    Found 1 week, 5 days ago on PhilSci Archive
  25.
    On the basis of the finding that the principle of causal closure permits causal connections between universes (Gamper 2017), Gamper has suggested a new line of research, scientific ontology (manuscript), which addresses the modal properties of a universe that can be joined with another universe via an interface. In this study a preliminary issue is investigated. To resolve a threatening inconsistency between two central definitions in Gamper (2017), I propose a hypothesis that aligns them. The hypothesis, it is found, also offers a new interpretation of the wave function of quantum physics.
    Found 1 week, 5 days ago on PhilPapers
  26.
    According to doxastic pragmatism, certain perceived practical factors, such as high stakes and urgency, have systematic effects on normal subjects’ outright beliefs. Endorsement of doxastic pragmatism can be found in Weatherson (2005), Bach (2005, 2008, 2010), Ganson (2008) and Nagel (2008, 2010). Upholders of doxastic pragmatism have so far endorsed a particular version of this view, which we may call threshold pragmatism. This view holds that the sensitivity of belief to the relevant practical factors is due to a corresponding sensitivity of the threshold on the degree of credence necessary for outright belief. According to an alternative but as yet unrecognised version of doxastic pragmatism, practical factors affect credence rather than the threshold on credence. Let’s call this alternative view credal pragmatism. In this paper, I argue that credal pragmatism is more plausible than threshold pragmatism. I show that the former view better accommodates a cluster of intuitive and empirical data. I conclude by considering the issue of whether our doxastic attitudes’ sensitivity to practical factors can be considered rational, and, if so, in what sense.
    Found 1 week, 6 days ago on PhilPapers
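The structural difference between the two pragmatisms can be put in a few lines of code (a toy model; the names and numbers are illustrative only): outright belief is credence at or above a threshold, and practical factors can move either the threshold or the credence itself.

```python
"""Threshold vs credal pragmatism as a toy model. Outright belief is
credence >= threshold; the two views disagree about which parameter
high stakes act on. All values here are hypothetical."""
from dataclasses import dataclass

@dataclass
class Subject:
    credence: float    # degree of belief, e.g. that the bank is open
    threshold: float   # minimum credence for outright belief

    def believes(self) -> bool:
        return self.credence >= self.threshold

low_stakes = Subject(credence=0.95, threshold=0.90)
# Threshold pragmatism: high stakes raise the bar for belief.
high_stakes_tp = Subject(credence=0.95, threshold=0.97)
# Credal pragmatism: high stakes depress the credence itself.
high_stakes_cp = Subject(credence=0.85, threshold=0.90)

for label, s in [("low stakes", low_stakes),
                 ("high stakes, threshold view", high_stakes_tp),
                 ("high stakes, credal view", high_stakes_cp)]:
    print(f"{label}: believes = {s.believes()}")
# Both views predict loss of outright belief under high stakes; they
# disagree about which parameter the practical factors act on.
```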
  27.
    In the biomedical context, policy makers face a large amount of potentially discordant evidence from different sources. This prompts the question of how this evidence should be aggregated in the interests of best-informed policy recommendations. The starting point of our discussion is Hunter and Williams’ recent work on an automated aggregation method for medical evidence. Our negative claim is that it is far from clear what the relevant criteria for evaluating an evidence aggregator of this sort are. What is the appropriate balance between explicitly coded algorithms and implicit reasoning involved, for instance, in the packaging of input evidence? In short: What is the optimal degree of ‘automation’? On the positive side: We propose the ability to perform an adequate robustness analysis (which depends on the nature of the input variables and parameters of the aggregator) as the focal criterion, primarily because it directs efforts to what is most important, namely, the structure of the algorithm and the appropriate extent of automation. Moreover, where there are resource constraints on the aggregation process, one must also consider what balance between volume of evidence and accuracy in the treatment of individual evidence best facilitates inference. There is no prerogative to aggregate the total evidence available if this would in fact reduce overall accuracy.
    Found 1 week, 6 days ago on PhilPapers
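A toy version of the robustness analysis the authors make focal (the aggregator, weights, and data below are entirely hypothetical): evidence items carry an effect estimate and a quality weight, the aggregate is a weighted mean, and robustness is probed by perturbing the weights and checking whether the policy-relevant verdict flips.

```python
"""Sketch of a robustness analysis for an evidence aggregator. The
aggregation rule (weighted mean) and all inputs are hypothetical,
standing in for whatever algorithm and evidence packaging are used."""
import random

random.seed(0)
# (effect estimate, quality weight) per evidence item -- made-up data.
evidence = [(0.30, 0.9), (0.10, 0.4), (-0.05, 0.3), (0.25, 0.7)]

def aggregate(items):
    total_w = sum(w for _, w in items)
    return sum(e * w for e, w in items) / total_w

def robustness(items, trials=1000, noise=0.2):
    # Fraction of weight-perturbed runs preserving the verdict
    # 'the intervention has a positive effect'.
    base_verdict = aggregate(items) > 0
    same = 0
    for _ in range(trials):
        jittered = [(e, max(1e-6, w + random.uniform(-noise, noise)))
                    for e, w in items]
        same += (aggregate(jittered) > 0) == base_verdict
    return same / trials

print(f"aggregate effect: {aggregate(evidence):+.3f}")
print(f"verdict stability under weight perturbation: {robustness(evidence):.1%}")
```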
  28.
    In this paper we present a new categorical approach which aims to provide an original understanding of QM. Our logos categorical approach takes the main features of the quantum formalism as the standpoint from which to develop a conceptual representation that explains what the theory is really talking about, rather than as problems that need to be bypassed in order to allow a restoration of a classical “common sense” understanding of what there is. In particular, we discuss a solution to Kochen-Specker contextuality through the generalization of the meaning of global valuation. This idea has already been addressed by the so-called topos approach to QM, originally proposed by Isham, Butterfield and Döring, in terms of sieve-valued valuations. The logos approach to QM presents a different solution in terms of the notion of intensive valuation. This new solution stresses an ontological (rather than epistemic) reading of the quantum formalism and the need to restore an objective (rather than classical) conceptual representation and understanding of quantum physical reality.
    Found 1 week, 6 days ago on PhilSci Archive
  29.
    One of the critical problems with the classical philosophy of science is that it has not been quantitative in the past. But today the modern quantitative theory of information gives us the mathematical tools that are needed to make philosophy quantitative for the first time. A quantitative philosophy of science can provide vital insights into critical scientific questions ranging from the nature and properties of a Theory of Everything (TOE) in physics to the quantitative implications of Gödel’s celebrated incompleteness theorem for mathematics and physics. It also provides us with something that was conspicuously lacking in Kuhn’s famous book (1962) that introduced the idea of paradigm shifts: a precise definition of a paradigm. This paper will begin to investigate these and other philosophical implications of the modern quantitative theory of information.
    Found 1 week, 6 days ago on PhilSci Archive
  30.
    According to the Butterfield–Isham proposal, to understand quantum gravity we must revise the way we view the universe of mathematics. However, this paper demonstrates that the current elaborations of this programme neglect quantum interactions. The paper then introduces the Faddeev–Mickelsson anomaly, which obstructs the renormalization of Yang–Mills theory, suggesting that theorising about many-particle systems requires a many-topos view of mathematics itself: higher theory. As our main contribution, the topos-theoretic framework is used to conceptualise the fact that there are principally three different quantisation problems, the differences of which have been ignored not just by topos physicists but by most philosophers of science. We further argue that if higher theory turns out to be necessary for understanding quantum gravity, its implications for philosophy will be foundational: higher theory challenges the propositional concept of truth and thus the very meaning of theorising in science.
    Found 1 week, 6 days ago on PhilSci Archive