
Aside from its sheer intractability, the problem of vagueness has a historical dimension that lends it extra bite. The modern development of logic was primarily concerned to codify the canons of reasoning employed in mathematical proofs. A striking feature of the mathematical realm is that it’s one which is sharp: there’s no such thing as a number which is borderline odd, and the languages of pure mathematics don’t contain vague expressions of other categories; for example, none of Hilbert’s problems use ‘many’ in their formulation. So when we turn to the description of the empirical realm, we cannot avoid the question whether the apparent lack of sharpness requires that accommodations in logic be made. Epistemicism is the view that no accommodations need to be made, since the empirical realm is no less sharp than the mathematical (Sorensen 1988, 2001; Williamson 1994). In particular, if F is a predicate whose application is typically persistent across small changes in some quantity but not persistent across some large ones, then if some such large change is decomposed into a series of small changes, there will be a particular small change which unseats the predicate. On this view, the term ‘vague’ simply marks the epistemic inaccessibility of which small change does the damage.

From the Icosahedron to E8
Here is a little article I’m writing for the Newsletter of the London Mathematical Society. The regular icosahedron is connected to many ‘exceptional objects’ in mathematics, and here I describe two ways of using it to construct E8. One uses a subring of the quaternions called the ‘icosians’, while the other uses Du Val’s work on the resolution of Kleinian singularities. …

Iterated reflection principles have been employed extensively to unfold epistemic commitments that are incurred by accepting a mathematical theory. Recently this has been applied to theories of truth. The idea is to start with a collection of Tarski biconditionals and arrive by finitely iterated reflection at strong compositional truth theories. In the context of classical logic it is incoherent to adopt an initial truth theory in which A and ‘A is true’ are interderivable. In this article we show how in the context of a weaker logic, which we call Basic De Morgan Logic, we can coherently start with such a fully disquotational truth theory and arrive at a strong compositional truth theory by applying a natural uniform reflection principle a finite number of times.

An interesting new paper by Zylstra attempts to cast doubt on the project of analyzing essence in terms of necessity plus something else. As Fine famously pointed out, it is plausible that the set {Socrates} essentially contains Socrates but that Socrates does not essentially belong to {Socrates}. …

What is meant by ‘One True Logic’ is sometimes not made entirely clear — what is a logic, and what is it for one of them to be true? Since the study of logic involves giving a theory of logical consequence for formal languages, the view must be that there is one true theory of logical consequence. In order for such a logic to be true, it must be capable of correct representation. What do logics represent? It is clear from the various uses of applied logic that they can represent many different sorts of phenomena. For the purposes of traditional pure logic, though, theories of consequence are frequently taken to represent natural language inference.

Kuhn argued that scientific theory choice is, in some sense, a rational matter, but one that is not fully determined by shared objective scientific virtues like accuracy, simplicity, and scope. Okasha imports Arrow’s impossibility theorem into the context of theory choice to show that rather than not fully determining theory choice, these virtues cannot determine it at all. If Okasha is right, then there is no function (satisfying certain desirable conditions) from ‘preference’ rankings supplied by scientific virtues over competing theories (or models, or hypotheses) to a single all-things-considered ranking. This threatens the rationality of science. In this paper we show that if Kuhn’s claims about the role that subjective elements play in theory choice are taken seriously, then the threat dissolves.

Classical higher-order logic, when utilized as a metalogic in which various other (classical and non-classical) logics can be shallowly embedded, is well suited for realising a universal logic reasoning approach. Universal logic reasoning in turn, as envisioned already by Leibniz, may support the rigorous formalisation and deep logical analysis of rational arguments within machines. A respective universal logic reasoning framework is described and a range of exemplary applications are discussed. In the future, universal logic reasoning in combination with appropriate, controlled forms of rational argumentation may serve as a communication layer between humans and intelligent machines.

Starting from a generalization of the standard axioms for a monoid we present a stepwise development of various, mutually equivalent foundational axiom systems for category theory. Our axiom sets have been formalized in the Isabelle/HOL interactive proof assistant, and this formalization utilizes a semantically correct embedding of free logic in classical higher-order logic. The modeling and formal analysis of our axiom sets has been significantly supported by a series of experiments with automated reasoning tools integrated with Isabelle/HOL. We also address the relation of our axiom systems to alternative proposals from the literature, including an axiom set proposed by Freyd and Scedrov for which we reveal a technical issue (when encoded in free logic): either all operations, e.g. morphism composition, are total or their axiom system is inconsistent. The repair for this problem is quite straightforward, however.

This paper presents an alternative to standard dynamic semantics. It uses the strong Kleene connectives to give a unified account of E-type anaphora and presupposition projection. The system is more conservative and simpler than standard dynamic treatments of these two phenomena, and, I argue, has empirical advantages in its treatment of disjunction and negation.

The Tarskian notion of truth-in-a-model is the paradigm formal capture of our pre-theoretical notion of truth for semantic purposes. But what exactly makes Tarski’s construction so well suited for semantics is seldom discussed. In Simchen (2017a) I articulate a certain requirement on the successful formal modeling of truth for semantics – “locality-per-reference” – against a background discussion of metasemantics and its relation to truth-conditional semantics. It is a requirement on any formal capture of sentential truth vis-à-vis the interpretation of singular terms and it is clearly met by the Tarskian notion. In this paper another such requirement is explored – “locality-per-application” – which is a requirement on a formal capture of sentential truth vis-à-vis the interpretation of predicates. This second requirement is also clearly met by the Tarskian notion. The two requirements taken together offer a fuller answer than has been hitherto available to the question of what makes Tarski’s notion of truth-in-a-model especially well suited for semantics.

We develop quantifier elimination procedures for a fragment of higher-order logic arising from the formalization of distributed systems (especially of fault-tolerant ones). Such procedures can be used in symbolic manipulations like the computation of Pre/Post images and of projections. We show in particular that our procedures are quite effective in producing counter abstractions that can be model-checked using standard SMT technology.

In “Mathematics is megethology” (Lewis, 1993) David K. Lewis proposes a structuralist reconstruction of classical set theory based on mereology. In order to formulate suitable hypotheses about the size of the universe of individuals without the help of set-theoretical notions, he uses the device of Boolos’ plural quantification for treating second-order logic without commitment to set-theoretical entities. In this paper we show how, assuming the existence of a pairing function on atoms as the only assumption not expressed in a mereological language, a mereological foundation of set theory is achievable within first-order logic. Furthermore, we show how a mereological codification of ordered pairs is achievable with a very restricted use of the notion of plurality without plural quantification.

Suppose someone (P1) does something that is wrongful only in virtue of the risk that it will enable another person (P2) to commit a wrongdoing. Suppose further that P1’s conduct does indeed turn out to enable P2’s wrongdoing. The resulting wrong is agentially mediated: P1 is an enabling agent and P2 is an intervening agent. Whereas the literature on intervening agency focuses on whether P2’s status as an intervening agent makes P1’s conduct less bad, I turn this issue on its head by investigating whether P1’s status as an enabling agent makes P2’s conduct more bad. I argue that it does: P2 wrongs not just the victims of ϕ but P1 as well, by acting in a way that wrongfully makes P1 accountable for ϕ. This has serious implications for compensatory and defensive liability in cases of agentially mediated wrongs.

Here’s a cute connection between topological entropy, braids, and the golden ratio. I learned about it in this paper:
• Jean-Luc Thiffeault and Matthew D. Finn, Topology, braids, and mixing in fluids. …
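
The headline number can be checked in a few lines. This is my own sketch using the standard fact that the 3-strand braid σ1σ2⁻¹ has dilatation equal to the spectral radius of the matrix [[2, 1], [1, 1]], which is φ², so its topological entropy is 2 log φ; the variable names are mine, not the paper’s notation.

```python
import math

# Largest eigenvalue of [[2, 1], [1, 1]], the transition matrix standardly
# associated with the 3-strand braid s1 s2^{-1}.  For a 2x2 matrix with
# trace t and determinant d the eigenvalues are (t ± sqrt(t^2 - 4d)) / 2.
trace, det = 3.0, 1.0
dilatation = (trace + math.sqrt(trace ** 2 - 4 * det)) / 2

phi = (1 + math.sqrt(5)) / 2      # golden ratio
entropy = math.log(dilatation)    # topological entropy = log(dilatation)

print(round(dilatation, 6))  # 2.618034, i.e. phi^2
print(round(entropy, 6))     # 0.962424, i.e. 2 log(phi)
```

So the entropy of this simplest mixing protocol is exactly 2 log φ, which is where the golden ratio enters.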

J. D. Hamkins and W. Woodin, “The universal finite set,” ArXiv e-prints, pp. 1–16, 2017. (manuscript under review)
@ARTICLE{HamkinsWoodin:The-universal-finite-set,
author = {Joel David Hamkins and W.~Hugh Woodin},
title = {The universal finite set},
journal = {ArXiv e-prints},
year = {2017},
volume = {},
number = {},
pages = {1--16},
month = {},
note = {manuscript under review},
abstract = {},
keywords = {under-review},
source = {},
doi = {},
eprint = {1711.07952},
archivePrefix = {arXiv},
primaryClass = {math.LO},
url = {http://jdh.hamkins.org/the-universal-finite-set},
}
Abstract. …

According to orthodox (Kolmogorovian) probability theory, conditional probabilities are by definition certain ratios of unconditional probabilities. As a result, orthodox conditional probabilities are regarded as undefined whenever their antecedents have zero unconditional probability. This has important ramifications for the notion of probabilistic independence.
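
The ratio definition and its breakdown at probability-zero antecedents can be made concrete in a minimal sketch; the toy distribution and helper names here are my own invention, not any standard library API.

```python
from fractions import Fraction

# Toy unconditional distribution over four outcomes; 'd' gets probability zero.
P = {'a': Fraction(1, 2), 'b': Fraction(1, 4), 'c': Fraction(1, 4), 'd': Fraction(0)}

def prob(event):
    """Unconditional probability of a set of outcomes."""
    return sum(P[w] for w in event)

def cond(A, B):
    """Kolmogorovian conditional probability P(A | B) = P(A & B) / P(B),
    undefined (here: None) whenever P(B) = 0."""
    pB = prob(B)
    if pB == 0:
        return None
    return prob(A & B) / pB

print(cond({'a'}, {'a', 'b'}))  # 2/3
print(cond({'a'}, {'d'}))       # None: the antecedent has zero probability
```

On this orthodox picture there is simply no fact about P(A | d), which is why zero-probability antecedents complicate the usual ratio-based definition of probabilistic independence.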

The previous two chapters have sought to show that the probability calculus cannot serve as a universally applicable logic of inductive inference. We may well wonder whether there might be some other calculus of inductive inference that can be applied universally. It would, perhaps, arise through a weakening of the probability calculus. The principal source of difficulty addressed in those chapters was the additivity of the probability calculus. Such a weakening seems possible as far as additivity is concerned. Something like it is achieved with the Shafer-Dempster theory of belief functions. However, there is a second, lingering problem. Bayesian analyses require prior probabilities. As we shall see below, these prior probabilities are never benign. They always make a difference to the final result.

This paper considers states on the Weyl algebra of the canonical commutation relations over the phase space R^{2n}. We show that a state is regular iff its classical limit is a countably additive Borel probability measure on R^{2n}. It follows that one can “reduce” the state space of the Weyl algebra by altering the collection of quantum mechanical observables so that all states are ones whose classical limit is physical.

I subject the semantic claims of stage theory to scrutiny and show that it’s unclear how to make them come out true for a simple and deep reason: the stage theorist needs tensed elements to semantically modify the denotations of referring expressions to enable us to talk about past and future stages. But in the syntax of natural language, expressions carrying tense modify verbs and adjectives and not referring expressions. This mismatch between what the stage theorist needs, and what language provides, makes it hard to see how the stage theorist’s semantic claims could be true.

The purpose of this paper is to present a paraconsistent formal system and a corresponding intended interpretation according to which true contradictions are not tolerated. Contradictions are, instead, epistemically understood as conflicting evidence, where evidence for a proposition A is understood as reasons for believing that A is true. The paper defines a paraconsistent and paracomplete natural deduction system, called the Basic Logic of Evidence (BLE), and extends it to the Logic of Evidence and Truth (LETJ). The latter is a logic of formal inconsistency and undeterminedness that is able to express not only preservation of evidence but also preservation of truth. LETJ is antidialetheist in the sense that, according to the intuitive interpretation proposed here, its consequence relation is trivial in the presence of any true contradiction. Adequate semantics and a decision method are presented for both BLE and LETJ, as well as some technical results that fit the intended interpretation.

Network analysis needs tools to infer distributions over graphs of arbitrary size from a single graph. Assuming the distribution is generated by a continuous latent space model which obeys certain natural symmetry and smoothness properties, we establish three levels of consistency for nonparametric maximum likelihood inference as the number of nodes grows: (i) the estimated locations of all nodes converge in probability on their true locations; (ii) the distribution over locations in the latent space converges on the true distribution; and (iii) the distribution over graphs of arbitrary size converges.
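
A toy generative sketch of such a continuous latent space model: each node gets a latent location, and edges form independently with probability given by a smooth, symmetric link function of the two locations. The particular link function, names, and parameters below are illustrative assumptions of mine, not the paper’s model or estimator.

```python
import math
import random

def sample_graph(n, link=lambda x, y: math.exp(-abs(x - y)), seed=0):
    """Sample an n-node graph from a toy continuous latent space model.

    Each node receives a latent location drawn uniformly from [0, 1], and
    nodes i, j are linked independently with probability link(x_i, x_j).
    The link function is symmetric and smooth, as the consistency results
    for such models require."""
    rng = random.Random(seed)
    locs = [rng.random() for _ in range(n)]
    edges = {(i, j)
             for i in range(n) for j in range(i + 1, n)
             if rng.random() < link(locs[i], locs[j])}
    return locs, edges

locs, edges = sample_graph(50)
density = len(edges) / (50 * 49 / 2)
print(f"{len(edges)} edges, density {density:.2f}")
```

Inference then runs this process in reverse: from one observed graph, estimate the node locations and the latent distribution, which is what the three consistency levels in the abstract are about.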

There are various equivalent formulations of the Church-Turing thesis. A common one is that every effective computation can be carried out by a Turing machine. The Church-Turing thesis is often misunderstood, particularly in recent writing in the philosophy of mind.

This paper is about the putative theoretical virtue of strength, as it might be used in abductive arguments to the correct logic in the epistemology of logic. It argues for three theses. The first is that the well-defined property of logical strength is neither a virtue nor a vice, so that logically weaker theories are not—all other things being equal—worse or better theories than logically stronger ones. The second thesis is that logical strength does not entail the looser characteristic of scientific strength, and the third is that many modern logics are on a par—or can be made to be on a par—with respect to scientific strength.

“Intuitionistic logic” is a term that unfortunately gains ever greater currency; it conveys a wholly false view on intuitionistic mathematics. —Freudenthal 1937
Intuitionistic logic is an offshoot of L.E.J. Brouwer’s intuitionistic mathematics. A widespread misconception has it that intuitionistic logic is the logic underlying Brouwer’s intuitionism; instead, the intuitionism underlies the logic, which is construed as an application of intuitionistic mathematics to language. Intuitionistic mathematics consists in the act of effecting mental constructions of a certain kind. These are themselves not linguistic in nature, but when acts of construction and their results are described in a language, the descriptions may come to exhibit linguistic patterns.

The method of explication has been somewhat of a hot topic in the last ten years. Despite the multifaceted research that has been directed at the issue, one may perceive a lack of step-by-step procedural or structural accounts of explication. This paper aims at providing a structural account of the method of explication in continuation of the works of Geo Siegwart. It is enhanced with a detailed terminology for the assessment and comparison of explications. The aim is to provide means to talk about explications, including their criticisms and their interrelations. There is hope that this treatment can serve as a foundation for a step-by-step guide for explicators. At least it should help to frame and mediate explicative disputes. In closing, the enterprise will itself be considered an explication of ‘explication’, though consecutive explications improving on this one are undoubtedly conceivable.

The previous chapter examined the inductive logic applicable to an infinite lottery machine. Such a machine generates a countably infinite set of outcomes, that is, there are as many outcomes as natural numbers, 1, 2, 3, … We found there that, if the lottery machine is to operate without favoring any particular outcome, the inductive logic native to the system is not probabilistic. A countably infinite set is the smallest in the hierarchy of infinities. The next routinely considered is a continuum-sized set, such as given by the set of all real numbers or even just by the set of all real numbers in some interval, from, say, 0 to 1.
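
The core tension in the countable case can be seen in a few lines: a uniform assignment of the same probability p to each of countably many outcomes can never total 1. This is a toy illustration of the standard additivity argument, in my own notation rather than the chapter’s.

```python
# A fair lottery on the natural numbers would give every outcome the same
# probability p.  Countable additivity then forces the total probability to
# be the limit of p * n as n grows: it stays at 0 if p == 0 and grows
# without bound if p > 0, so it can never equal 1.  Hence no countably
# additive probability measure is uniform over the naturals, and a
# non-favoring inductive logic for this system cannot be probabilistic.

def total_probability(p, n):
    """Probability assigned to the first n outcomes under a uniform p."""
    return p * n

for p in (0.0, 1e-6):
    print(p, [total_probability(p, 10 ** k) for k in (3, 6, 9)])
```

With p = 0 the partial sums never leave zero; with any positive p they pass 1 and keep growing, which is the dilemma the chapter’s non-probabilistic logic is designed to escape.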

Parthood is used widely in ontologies across subject domains. Some modelling guidance can be gleaned from Ontology, yet it offers multiple mereological theories, and even more when combined with topology, i.e., mereotopology. To complicate the landscape, decidable languages put restrictions on the language features, so that only fragments of the mereo(topo)logical theories can be represented, yet during modelling, those full features may be needed to check correctness. We address these issues by specifying a structured network of theories formulated in multiple logics that are glued together by the various linking constructs of the Distributed Ontology Language, DOL. For the KGEMT mereotopological theory and five subtheories, together with the DL-based OWL species and first- and second-order logic, this network in DOL orchestrates 28 ontologies. Further, we propose automated steps toward resolution of language feature conflicts when combining modules, availing of the new ‘OWL classifier’ tool that pinpoints profile violations.

The aim of this paper is to argue that all—or almost all—logical rules have exceptions. In particular, I argue that this is a moral that we should draw from the semantic paradoxes. Of course, responding to the paradoxes by revising classical logic in some way is familiar: see, e.g., Kripke [1975], Priest [1987/2006], Soames [1999], Maudlin [2004], Field [2008] and Beall [2009]. But such proposals tend to advocate replacing classical logic with some alternative logic. That is, some alternative system of rules—where, of course, it is taken for granted that these alternatives hold without exception.

The principles of Conditional Excluded Middle (CEM) and Simplification of Disjunctive Antecedents (SDA) have received substantial attention in isolation. Both principles are plausible generalizations about natural language conditionals. There is however little or no discussion of their interaction. This paper aims to remedy this gap and explore the significance of having both principles constrain the logic of the conditional. Our negative finding is that, together with elementary logical assumptions, CEM and SDA yield a variety of implausible consequences. Despite these incompatibility results, we open up a narrow space to satisfy both. We show that, by simultaneously appealing to the alternative-introducing analysis of disjunction and to the theory of homogeneity presuppositions, we can satisfy both. Furthermore, the theory that validates both principles resembles a recent semantics that is defended by Santorio on independent grounds. The cost of this approach is that it must give up the transitivity of entailment: we suggest that this is a feature, not a bug, and connect it with recent developments of intransitive notions of entailment.

I’ve been slacking off on writing this series of posts… but for a good reason: I’ve been busy writing a paper on the same topic! In the process I caught a couple of mistakes in what I’ve said so far. But more importantly, there’s a version out now, that you can read:
• John Baez, John Foley, Blake Pollard and Joseph Moeller, Network models. …