The purpose of this paper is to present a paraconsistent formal system and a corresponding intended interpretation according to which true contradictions are not tolerated. Contradictions are, instead, epistemically understood as conflicting evidence, where evidence for a proposition A is understood as reasons for believing that A is true. The paper defines a paraconsistent and paracomplete natural deduction system, called the Basic Logic of Evidence (BLE), and extends it to the Logic of Evidence and Truth (LETJ). The latter is a logic of formal inconsistency and undeterminedness that is able to express not only preservation of evidence but also preservation of truth. LETJ is anti-dialetheist in the sense that, according to the intuitive interpretation proposed here, its consequence relation is trivial in the presence of any true contradiction. Adequate semantics and a decision method are presented for both BLE and LETJ, as well as some technical results that fit the intended interpretation.
Network analysis needs tools to infer distributions over graphs of arbitrary size from a single graph. Assuming the distribution is generated by a continuous latent space model which obeys certain natural symmetry and smoothness properties, we establish three levels of consistency for non-parametric maximum likelihood inference as the number of nodes grows: (i) the estimated locations of all nodes converge in probability on their true locations; (ii) the distribution over locations in the latent space converges on the true distribution; and (iii) the distribution over graphs of arbitrary size converges.
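The class of models described above can be sketched concretely. The snippet below is an illustrative toy instance, not the paper's estimator: the uniform latent distribution and the exponential link function are assumptions chosen for simplicity. It shows the two properties the abstract invokes: symmetry (the edge probability depends symmetrically on the two latent locations) and smoothness (the link varies continuously in the locations).

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_graph(n, link=lambda x, y: np.exp(-abs(x - y))):
    """Sample an n-node graph from a toy continuous latent space model.

    Each node receives a latent location drawn i.i.d. from a common
    distribution (here uniform on [0, 1], an illustrative choice);
    edges appear independently with probability given by a smooth,
    symmetric link function of the two endpoint locations.
    """
    locs = rng.uniform(0.0, 1.0, size=n)       # latent locations
    p = link(locs[:, None], locs[None, :])     # symmetric edge probabilities
    adj = rng.uniform(size=(n, n)) < p         # independent Bernoulli edges
    adj = np.triu(adj, 1)                      # keep one draw per pair, no self-loops
    adj = adj + adj.T                          # symmetrize
    return locs, adj.astype(int)

locs, adj = sample_graph(50)
```

Because larger graphs are generated by drawing more locations from the same latent distribution, a single observed graph carries information about the whole distribution over graphs of arbitrary size, which is what makes the consistency results (i)–(iii) meaningful.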
The problem of the direction of the electromagnetic arrow of time is perhaps the most perplexing of the major unsolved problems of contemporary physics, because the usual tools of theoretical physics cannot be used to investigate it. Even the clues provided by the CP violation of the K₂ meson, which have led to a profound insight into the dominance of matter over antimatter in the universe, have not shed any light on the problem of the origin of the electromagnetic arrow of time.
One response to the problem of logical omniscience in standard possible worlds models of belief is to extend the space of worlds so as to include impossible worlds. It is natural to think that essentially the same strategy can be applied to probabilistic models of partial belief, for which parallel problems also arise. In this paper, I note a difficulty with the inclusion of impossible worlds in probabilistic models. Under weak assumptions about the space of worlds, most of the propositions which can be constructed from possible and impossible worlds are in an important sense inexpressible, leaving the probabilistic model committed to saying that agents in general have at least as many attitudes towards inexpressible propositions as they do towards expressible propositions. If it is reasonable to think that our attitudes are generally expressible, then a model with such commitments looks problematic.
There are various equivalent formulations of the Church-Turing thesis. A common one is that every effective computation can be carried out by
a Turing machine. The Church-Turing thesis is often misunderstood,
particularly in recent writing in the philosophy of mind.
After a brief presentation of Feynman diagrams, we criticize the idea that Feynman diagrams can be considered to be pictures or depictions of actual physical processes. We then show that the best interpretation of the role they play in quantum field theory and quantum electrodynamics is captured by Hughes' Denotation, Demonstration, and Interpretation theory of models (DDI), where “models” are to be interpreted as inferential, non-representational devices constructed in given social contexts by the community of physicists.
By perfectly fine I mean: not at all morally blameworthy. By aiming I mean: being ready to calibrate ourselves up or down to hit the target. I would contrast aiming with settling, which does not necessarily involve calibrating down if one is above target. …
Joseph Halpern and Judea Pearl draw upon structural equation models to develop an attractive analysis of ‘actual cause’. Their analysis is designed for the case of deterministic causation. I show that their account can be naturally extended to provide an elegant treatment of probabilistic causation.
September’s general elections have brought Germany its own Brexit/Trump moment. For the first time since 1945, a far-right nationalist party is part of the German national parliament. The Alternative for Germany, AfD, gained 12.6% of German votes. …
This paper is about the putative theoretical virtue of strength, as it might be used in abductive arguments to the correct logic in the epistemology of logic. It argues for three theses. The first is that the well-defined property of logical strength is neither a virtue nor a vice, so that logically weaker theories are not—all other things being equal—worse or better theories than logically stronger ones. The second thesis is that logical strength does not entail the looser characteristic of scientific strength, and the third is that many modern logics are on a par—or can be made to be on a par—with respect to scientific strength.
As Feynman (1982) observed, “we always have had a great deal of difficulty in understanding the world view that quantum mechanics represents” (471). Among the perplexing aspects of quantum mechanics is its seeming, on a wide variety of presently live realist interpretations (including but not limited to the so-called ‘orthodox’ interpretation), to violate the classical supposition of ‘value definiteness’, according to which the properties—a.k.a. ‘observables’—of a given particle or system have precise values at all times. Indeed, value indefiniteness lies at the heart of what is supposed to be distinctive about quantum phenomena, as per the following classic cases:
Facts, philosophers like to say, are opposed to theories and to values
(cf. Rundle 1993) and are to be distinguished from things, in
particular from complex objects, complexes and wholes, and from
relations. They are the objects of certain mental states and acts,
they make truth-bearers true and correspond to truths, they are part
of the furniture of the world. We present and discuss some
philosophical and formal accounts of facts.
“Intuitionistic logic” is a term that unfortunately gains
ever greater currency; it conveys a wholly false view on
intuitionistic mathematics. —Freudenthal 1937
Intuitionistic logic is an offshoot of L.E.J. Brouwer’s
intuitionistic mathematics. A widespread misconception has it that
intuitionistic logic is the logic underlying Brouwer’s
intuitionism; instead, the intuitionism underlies the logic, which is
construed as an application of intuitionistic mathematics to language. Intuitionistic mathematics consists in the act of effecting mental
constructions of a certain kind. These are themselves not linguistic
in nature, but when acts of construction and their results are
described in a language, the descriptions may come to exhibit …
The method of explication has been something of a hot topic in the last ten years. Despite the multifaceted research that has been directed at the issue, one may perceive a lack of step-by-step procedural or structural accounts of explication. This paper aims to provide a structural account of the method of explication in continuation of the work of Geo Siegwart, enhanced with a detailed terminology for the assessment and comparison of explications. The aim is to provide means to talk about explications, including their criticisms and their interrelations. The hope is that this treatment can serve as a foundation for a step-by-step guide for explicators; at the least, it should help to frame and mediate explicative disputes. In closing, the enterprise itself is considered as an explication of ‘explication’, though consecutive explications improving on this one are undoubtedly conceivable.
This paper introduces and examines the prospects of the recent research in a holographic relation between entanglement and spacetime pioneered by Mark van Raamsdonk and collaborators. Their thesis is that entanglement in a holographic quantum state is crucial for connectivity in its spacetime dual. Utilizing this relation, the paper develops a thought experiment that promises to probe the nature of spacetime by monitoring the behavior of a spacetime when all entanglement is removed between local degrees of freedom in its dual quantum state. The thought experiment suggests a picture of spacetime as consisting of robust nodes that are connected by non-robust bulk spacetime that is sensitive to changes in entanglement in the dual quantum state. However, rather than pursuing the thought experiment in further detail, the credibility of the relation between spacetime and entanglement in this zero entanglement limit is questioned. The energy of a quantum system generally increases when all entanglement is removed between subsystems, and so does the energy of its spacetime dual. If a system is subdivided into an infinite number of subsystems and all entanglement between them is removed, then the energy of the quantum system and the energy of its spacetime dual are at risk of diverging. While this is a prima facie worry for the thought experiment, it does not constitute a conclusive refutation.
The previous chapter examined the inductive logic applicable to an infinite lottery machine. Such a machine generates a countably infinite set of outcomes, that is, there are as many outcomes as natural numbers, 1, 2, 3, … We found there that, if the lottery machine is to operate without favoring any particular outcome, the inductive logic native to the system is not probabilistic. A countably infinite set is the smallest in the hierarchy of infinities. The next routinely considered is a continuum-sized set, such as given by the set of all real numbers or even just by the set of all real numbers in some interval, from, say, 0 to 1.
Darwin and contemporary biologists argue that all present-day life traces back to one or a few common ancestors. Here we investigate the relationship of different evolutionary processes to this hypothesis of common ancestry. We identify the property of an evolutionary process that determines what its probabilistic impact on the common ancestry thesis will be. The point of this exercise is to understand how the parts of Darwin's powerful theory fit together, not to call into question common ancestry or natural selection, since these two pillars of Darwin's theory enjoy strong support.
Parthood is used widely in ontologies across subject domains. Some modelling guidance can be gleaned from Ontology, yet it offers multiple mereological theories, and even more when combined with topology, i.e., mereotopology. To complicate the landscape, decidable languages put restrictions on the language features, so that only fragments of the mereo(topo)logical theories can be represented, yet during modelling, those full features may be needed to check correctness. We address these issues by specifying a structured network of theories formulated in multiple logics that are glued together by the various linking constructs of the Distributed Ontology Language, DOL. For the KGEMT mereotopological theory and five sub-theories, together with the DL-based OWL species and first- and second-order logic, this network in DOL orchestrates 28 ontologies. Further, we propose automated steps toward resolution of language feature conflicts when combining modules, availing of the new ‘OWL classifier’ tool that pinpoints profile violations.
No one has done more over the past four decades to draw attention to the importance of, and attempt to solve, a particularly vexing problem in ethics—the Trolley Problem—than Judith Jarvis Thomson. Though the problem is originally due to Philippa Foot, Thomson showed how Foot’s simple solution would not do and offered some of her own.1 No solution is uncontroversial and the problem remains a thorn in the side of non-consequentialist moral theory. Recently, however, Thomson has changed her mind about the problem. She no longer thinks she was right to reject Foot’s solution to it. I argue that, though illuminating, Thomson’s current take on the Trolley Problem is mistaken. I end with a solution to the problem that I find promising. In sections 1–3, I present Thomson’s version of the Trolley Problem (one involving a twist on Foot’s original version) and her various responses to it. In sections 4 and 5, I evaluate her various takes on the problem, including her most recent rejection of the problem. In section 6, I offer a diagnosis of the purported data on the basis of which Thomson has mistakenly come to reject the problem. And in section 7, I present and defend my own preferred solution to the Trolley Problem.
Population ethics is the study of the unique ethical issues that arise when one’s actions can change who will come into existence: actions that lead to additional people being born, fewer people being born, or different people being born. The most obvious cases are those of an individual deciding whether to have a child, or of society setting the social policies surrounding procreation. However, issues of population ethics come up much more widely than this. How bad is it if climate change reduces the planet’s “carrying capacity”? How important is it to lower the risks of human extinction? How important is it, if at all, that humanity eventually seeks a future beyond Earth, allowing a much greater population?
The aim of this paper is to argue that all—or almost all—logical rules have exceptions. In particular, I argue that this is a moral that we should draw from the semantic paradoxes. Of course, responding to the paradoxes by revising classical logic in some way is familiar: see, e.g., Kripke, Priest [1987/2006], Soames, Maudlin, Field, and Beall. But such proposals tend to advocate replacing classical logic with some alternative logic. That is, some alternative system of rules—where, of course, it is taken for granted that these alternatives hold without exception.
In some sense, it is clear that the numbers count. That is, it is clear that the number of thinkers on a given side of a disputed issue is typically relevant to the degree of support their opinions provide. It is natural to think that numbers cannot be all that matter, though, for the extent to which the opinions are independent also seems to have substantial epistemic import. It is difficult, however, to capture explicitly the type of dependence and independence that can play this epistemic role. This paper investigates the issue, putting forward an expectational account of belief dependence and independence – one that can be applied whether we think in terms of credences or in terms of all-or-nothing beliefs.
The principles of Conditional Excluded Middle (CEM) and Simplification of Disjunctive Antecedents (SDA) have received substantial attention in isolation. Both principles are plausible generalizations about natural language conditionals. There is, however, little or no discussion of their interaction. This paper aims to remedy this gap and explore the significance of having both principles constrain the logic of the conditional. Our negative finding is that, together with elementary logical assumptions, CEM and SDA yield a variety of implausible consequences. Despite these incompatibility results, we open up a narrow space in which both can be satisfied: by simultaneously appealing to the alternative-introducing analysis of disjunction and to the theory of homogeneity presuppositions, we can validate both principles. Furthermore, the theory that validates both resembles a recent semantics defended by Santorio on independent grounds. The cost of this approach is that it must give up the transitivity of entailment; we suggest that this is a feature, not a bug, and connect it with recent developments of intransitive notions of entailment.
David Lewis describes his “modal realism” as a philosopher’s paradise. The present essay applies this paradise in ways that Lewis himself did not pursue. This essay begins to develop one idea: A modal realist account of necessary entities and possible worlds solves a significant theological problem, the problem of evil. Lewis’ modal realism, appropriately adjusted, enables a persuasive solution to the theological problem of evil. It furthermore satisfies the spirit of Leibniz’ thesis that the universe is the best possible world. A central problem of Western monotheism is how an omnipotent, omniscient, and perfectly good and loving God is compatible with the evil aspects of the actual world. Modal realism explains the existence of very imperfect worlds as not only compatible with such a God, but required by the existence of such a God.
In the May 15, 1935 issue of Physical Review Albert Einstein
co-authored a paper with his two postdoctoral research associates at
the Institute for Advanced Study, Boris Podolsky and Nathan Rosen. The
article was entitled “Can Quantum Mechanical Description of
Physical Reality Be Considered Complete?” (Einstein et
al. 1935). Generally referred to as “EPR”, this paper
quickly became a centerpiece in debates over the interpretation of
quantum theory, debates that continue today. Ranked by impact, EPR is
among the top ten of all papers ever published in Physical Review.
I’ve been slacking off on writing this series of posts… but for a good reason: I’ve been busy writing a paper on the same topic! In the process I caught a couple of mistakes in what I’ve said so far. But more importantly, there’s a version out now, that you can read:
• John Baez, John Foley, Blake Pollard and Joseph Moeller, Network models. …
A controversial principle in Catholic moral theology is the principle of “counseling the lesser evil”, sometimes confusingly (or confusedly) presented as the “principle of the lesser evil”. The principle is one that the Church has not pronounced on. …
This paper discusses how to update one’s credences based on evidence that has initial probability 0. I advance a diachronic norm, Kolmogorov Conditionalization, that governs credal reallocation in many such learning scenarios. The norm is based upon Kolmogorov’s theory of conditional probability. I prove a Dutch book theorem and converse Dutch book theorem for Kolmogorov Conditionalization. The two theorems establish Kolmogorov Conditionalization as the unique credal reallocation rule that avoids a sure loss in the relevant learning scenarios.
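The density-based special case of Kolmogorov's theory can be illustrated numerically. The sketch below is an assumption-laden toy example, not the paper's norm itself: it takes the joint density f(x, y) = x + y on the unit square (chosen only because it integrates to 1) and conditions on the probability-zero event X = 0.5 by taking the ratio of the joint density to the marginal, the disintegration that Kolmogorov Conditionalization generalizes.

```python
# Toy illustration of conditioning on a probability-zero event X = x
# via the ratio of joint to marginal density. The density f(x, y) = x + y
# is an illustrative assumption, not drawn from the paper.

def joint(x, y):
    return x + y  # integrates to 1 over the unit square

def marginal_x(x, n=10_000):
    # midpoint quadrature of joint(x, y) over y in [0, 1]
    return sum(joint(x, (i + 0.5) / n) for i in range(n)) / n

def conditional(y, x):
    # f(y | X = x): well defined even though P(X = x) = 0
    return joint(x, y) / marginal_x(x)

# posterior mean of Y given the zero-probability evidence X = 0.5
n = 10_000
mean = sum(((i + 0.5) / n) * conditional((i + 0.5) / n, 0.5)
           for i in range(n)) / n
print(round(mean, 4))  # close to the analytic value 7/12 ≈ 0.5833
```

The point of the example is only that the conditional density assigns a determinate credal reallocation where ratio conditionalization (dividing by P(X = 0.5) = 0) is undefined; the Dutch book theorems in the paper concern the general measure-theoretic case.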
What is the Problem of Universals? In this paper we take up the classic question and proceed as follows. In Sect. 1 we consider three problem-solving settings and define the notion of problem solving accordingly. Basically, we say that to solve problems is to eliminate undesirable, unspecified, or apparently incoherent scenarios. In Sect. 2 we apply the general observations from Sect. 1 to the Problem of Universals. More specifically, we single out two accounts of the problem which are based on the idea of eliminating apparently incoherent scenarios, and then propose modifications of those two accounts which, by contrast, are based on the idea of eliminating unspecified scenarios. In Sect. 3 we spell out two interesting ramifications.
In this article, I argue that it makes a moral difference whether an individual is worse off than she could have been. Here I part company with consequentialists such as Parfit and side with contractualists such as Scanlon. But, unlike some contractualists, I reject the view that all that matters is whether a principle can be justified to each particular individual, where such a justification is attentive to her interests, complaints, and other claims. The anonymous goodness of a distribution also matters. My attempt to reconcile contractualist and consequentialist approaches proceeds via a series of reflections on cases.