
The contest for leadership of the Scottish Labour party has reopened an old debate: is it acceptable for egalitarians to send their children to private school? One candidate, Anas Sarwar, has come under criticism for sending his son to the £8,000 a year Hutchesons’ Grammar school in Glasgow. …

J. D. Hamkins and J. Reitz, “The set-theoretic universe $V$ is not necessarily a class-forcing extension of HOD,” ArXiv e-prints, 2017. (manuscript under review)
@ARTICLE{HamkinsReitz:ThesettheoreticuniverseisnotnecessarilyaforcingextensionofHOD,
  author = {Joel David Hamkins and Jonas Reitz},
  title = {The set-theoretic universe $V$ is not necessarily a class-forcing extension of {HOD}},
  journal = {ArXiv e-prints},
  year = {2017},
  month = {September},
  note = {manuscript under review},
  eprint = {1709.06062},
  archivePrefix = {arXiv},
  primaryClass = {math.LO},
  url = {http://jdh.hamkins.org/theuniverseneednotbeaclassforcingextensionofhod},
}
Abstract. …

An action is something that takes place in the world, and that makes a difference to what the world looks like. Thus, actions are maps from states of the world to new states of the world. Actions can be of various kinds. The action of spilling coffee changes the state of your trousers. The action of telling a lie to your friend changes your friend’s state of mind (and maybe the state of your soul). The action of multiplying two numbers changes the state of certain registers in your computer. Despite the differences between these various kinds of actions, we will see that they can all be covered under the same logical umbrella.
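This picture of actions as state-to-state maps can be made concrete in a short sketch; the state dictionary and the particular action names below are illustrative assumptions, not part of the text:

```python
# Actions modeled as maps from states of the world to new states.
# The state representation (a dict) is an illustrative assumption.

def spill_coffee(state):
    """The action of spilling coffee changes the state of your trousers."""
    return {**state, "trousers": "stained"}

def multiply(a, b):
    """An action on a computer: store the product of two numbers in a register."""
    def action(state):
        return {**state, "register": a * b}
    return action

def compose(*actions):
    """Performing one action after another is composition of maps."""
    def composed(state):
        for act in actions:
            state = act(state)
        return state
    return composed

world = {"trousers": "clean", "register": 0}
new_world = compose(spill_coffee, multiply(6, 7))(world)
# new_world records stained trousers and 42 in the register
```

The point of the sketch is only that very different kinds of actions share one logical shape: a function from states to states, closed under composition.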

In this paper, we study the conditions under which the existence of interpolants (for quantifier-free formulae) is modular, in the sense that it can be transferred from two first-order theories T1, T2 to their combination T1 ∪ T2. We generalize the results from [3] to the case of non-disjoint signatures. As a surprising application, we relate the Horn combinability criterion of this paper to superamalgamability conditions known from propositional logic, and we use this fact to derive old and new results concerning the transfer of interpolation properties under fusions in modal logic.

Within the last few years there has been some interest in investigating the relationship between the truthlikeness (verisimilitude) and belief revision programs [2, 6]. One prominent result of this investigation is that, given any plausible account of truthlikeness and rational account of belief revision, expansions (+) and revisions (*) of a database (or belief state) D with a true input A are not guaranteed to increase the database’s truthlikeness. D here is a belief set (i.e. D = Cn(D)), represented where necessary by a corresponding propositional formula.

Kant said that we would never be able to know the true nature of matter: the things in themselves would remain unknown to us. There is a similar problem in quantum mechanics. One cannot directly attribute any property to a physical state represented by a ray in a Hilbert space. The general theory of relativity teaches that time and space are not as they appear to us, but claims to know that space and time in fact belong to a curved spacetime. It turned out in the last decades that it is extraordinarily difficult to combine both theories. Based on quantum mechanics, I argue in this paper that the things in themselves remain unknown. There is probably no substance which we can call spacetime.

Wave function realism is an interpretational framework for quantum theories that has been defended for its ability to provide a clear and natural metaphysics for quantum theories, one that is fundamentally both separable and local. This is in contrast to competitor primitive ontology frameworks, which may be separable but are not local, and to holist or structuralist approaches, which may be local but are not separable. The claim that wave function realist metaphysics is local, however, is not as straightforward as it has sometimes been assumed to be (nor as straightforward as the sense in which wave function realist metaphysics are separable). This paper distinguishes the different senses in which a metaphysics for physics may be local, the virtues a metaphysics local in these senses may have, and the capacity of wave function realism to deliver such a metaphysics.

In this tutorial, the meaning of natural language is analysed along the lines proposed by Gottlob Frege and Richard Montague. In building meaning representations, we assume that the meaning of a complex expression derives from the meanings of its components. Typed logic is a convenient tool to make this process of composition explicit. Typed logic allows for the building of semantic representations for formal languages and fragments of natural language in a compositional way. The tutorial ends with the discussion of an example fragment, implemented in the functional programming language Haskell (Haskell Team; Jones [2003]).
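The compositional process described here can be illustrated in miniature. The tutorial’s fragment is implemented in Haskell; the following Python analogue, with an assumed toy domain and lexicon, only sketches the core idea that typed word meanings combine by function application:

```python
# Illustrative sketch (toy domain and lexicon are assumptions, not the
# tutorial's fragment). Word meanings are typed functions; the meaning
# of a complex expression is computed from the meanings of its parts.

domain = {"mary", "john", "fido"}

# Predicates: type e -> t (entity to truth value)
sleeps = lambda x: x in {"mary", "fido"}
barks  = lambda x: x == "fido"
dog    = lambda x: x == "fido"

# Determiners: type (e -> t) -> (e -> t) -> t
every = lambda noun: lambda vp: all(vp(x) for x in domain if noun(x))
some  = lambda noun: lambda vp: any(vp(x) for x in domain if noun(x))

# "every dog barks": determiner applies to noun, then to verb phrase
print(every(dog)(barks))   # True
print(some(dog)(sleeps))   # True
print(every(lambda x: True)(barks))   # False: not everything barks
```

The design mirrors the typed-logic picture: each lexical item has a type, and sentence meanings arise purely by applying functions to arguments, with no sentence-level rules beyond composition.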

David Lewis’ “Principal Principle” is a purported principle of rationality connecting credence and objective chance. Almost all of the discussion of the Principal Principle in the philosophical literature assumes classical probability theory, which is unfortunate since the theory of modern physics that, arguably, speaks most clearly of objective chance is the quantum theory, and quantum probabilities are not classical probabilities. This paper develops an account of how chance works in quantum theory that reveals a connection between credence and quantum chance quite unlike what is envisioned in the philosophical literature: as a theorem of quantum probability, updating a completely additive chance function on a knowledge of chance brings credence into line with chance. The account also suggests a way of construing the Humean supervenience of chance that has the virtue of dissolving some puzzles about the “undermining” of chances. A number of interpretative moves in quantum theory are needed to generate the account of quantum chance on offer here, and they can all be disputed. But engaging in these disputes is part and parcel of naturalized metaphysics, and as such it can be more productive than engaging in the battle of intuitions among analytical metaphysicians about how chance ought to work in this and other possible worlds.
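A classical (non-quantum) toy model may help fix ideas about updating on knowledge of chance. The coin biases and the prior below are assumptions chosen for illustration; this is not the paper’s quantum construction:

```python
# Toy model of the Principal Principle in a classical setting (the
# hypotheses and numbers are illustrative assumptions).

# Chance hypotheses: the coin's bias toward heads is either 0.3 or 0.9,
# with the agent's prior credence split evenly between them.
prior = {0.3: 0.5, 0.9: 0.5}

# Before learning the chance, credence in "heads" is the expectation
# of chance under the prior: 0.5 * 0.3 + 0.5 * 0.9 = 0.6.
credence_heads = sum(p * ch for ch, p in prior.items())

def condition_on_chance(prior, known_chance):
    """Update on learning which chance hypothesis is true; the
    posterior concentrates on it, so credence equals the known chance."""
    posterior = {ch: (1.0 if ch == known_chance else 0.0) for ch in prior}
    return sum(p * ch for ch, p in posterior.items())

# Learning that the chance of heads is 0.9 drives credence to 0.9,
# bringing credence into line with chance.
updated = condition_on_chance(prior, 0.9)
```

The sketch displays the classical pattern that the paper argues recurs, as a theorem, for completely additive quantum chance functions.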

Last weekend, I gave a talk on big numbers, as well as a Q&A about quantum computing, at Festivaletteratura: one of the main European literary festivals, held every year in beautiful and historic Mantua, Italy. …

C. S. Peirce: 10 September 1839 – 19 April 1914
Sunday, September 10, was C.S. Peirce’s birthday. He’s one of my heroes. He’s a treasure chest on essentially any topic, and anticipated quite a lot in statistics and logic. …

In mereology, parthood is a single relation, and it is typically included in foundational ontologies. Some of these foundational ontologies, and many domain ontologies, use a plethora of parthood and part-whole relations, such as ‘sub-process’ and ‘portion’. This poses requirements on the foundational ontologies and, perhaps, on Ontology, as to what to do with these two different approaches to part-whole relations. We present an analysis of DOLCE, BFO, GFO, SUMO, GIST, and YAMATO on their inclusion and use of part-whole relations. It demonstrates that, for various reasons, none of them is a perfect fit. We then aim to bridge this gap with an orchestration of ontologies of part-whole relations that are aligned to several foundational ontologies and that can be imported into other ontologies.

By “alternative set theories” we mean systems of set
theory differing significantly from the dominant ZF
(Zermelo-Fraenkel set theory) and its close relatives (though we will
review these systems in the article). Among the systems we will review
are typed theories of sets, Zermelo set theory and its variations, New
Foundations and related systems, positive set theories, and
constructive set theories. An interest in the range of alternative set
theories does not presuppose an interest in replacing the dominant set
theory with one of the alternatives; acquainting ourselves with
foundations of mathematics formulated in terms of an alternative
system can be instructive as showing us what any set theory (including
the usual one) is supposed to do for us.

We would like to arrive at a single coherent pair of credences in X and ¬X. Perhaps we wish to use these to set our own credences; or perhaps we wish to publish them in a report of the WHO as the collective view of expert epidemiologists; or perhaps we wish to use them in a decision-making process to determine how medical research funding should be allocated in 2018. Given their expertise, we would like to use Amira’s and Benito’s credences when we are coming up with ours. However, there are two problems. First, Amira and Benito disagree — they assign different credences to X and different credences to ¬X. Second, Amira and Benito are incoherent — they each assign credences to X and ¬X that do not sum to 1. How, then, are we to proceed? There are natural ways to aggregate different credence functions; and there are natural ways to fix incoherent credence functions. Thus, we might fix Amira and Benito first and then aggregate the fixes; or we might aggregate their credences first and then fix up the aggregate. But what if these two disagree, as we will see they are sometimes wont to do? Which should we choose? To complicate matters further, there is a natural way to do both at once — it makes the credences coherent and aggregates them all at the same time. What if this one-step procedure disagrees with one or other or both of the two-step procedures, fix-then-aggregate and aggregate-then-fix? In what follows, I explore when such disagreement arises and what conditions guarantee that it will not. Then I will explain how these results may be used in philosophical arguments. I begin, however, with an overview of the paper.
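The possible divergence of the two two-step procedures can be seen in a small numerical sketch. The credence values, the normalization fix, and linear pooling as the aggregation rule are assumptions chosen for illustration, not the paper’s own definitions:

```python
# Illustrative sketch: fixing incoherent credences over X and not-X
# and aggregating them need not commute. The numbers and the choice
# of fix (normalization) and pool (averaging) are assumptions.

def fix(c):
    """Make a credence pair over X and not-X coherent by normalizing."""
    total = c[0] + c[1]
    return (c[0] / total, c[1] / total)

def aggregate(c1, c2):
    """Linear pooling: average the two credence pairs pointwise."""
    return ((c1[0] + c2[0]) / 2, (c1[1] + c2[1]) / 2)

amira  = (0.6, 0.6)   # incoherent: sums to 1.2
benito = (0.3, 0.5)   # incoherent: sums to 0.8

fix_then_agg = aggregate(fix(amira), fix(benito))  # about (0.4375, 0.5625)
agg_then_fix = fix(aggregate(amira, benito))       # about (0.45, 0.55)
# The two orders of operation yield different coherent pairs.
```

Under these particular choices the two procedures disagree, which is exactly the kind of case whose conditions the paper investigates.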

All Bayesian epistemologists agree on two claims. The first, which we might call Precise Credences, says that an agent’s doxastic state at a given time t in her epistemic life can be represented by a single credence function P_t, which assigns to each proposition A about which she has an opinion a precise numerical value P_t(A) that is at least 0 and at most 1. P_t(A) is the agent’s credence in A at t. It measures how strongly she believes A at t, or how confident she is at t that A is true. The second point of agreement, which is typically known as Probabilism, says that an agent’s credence function at a given time should be a probability function: that is, for all times t, P_t(⊤) = 1 for any tautology ⊤, P_t(⊥) = 0 for any contradiction ⊥, and P_t(A ∨ B) = P_t(A) + P_t(B) − P_t(A ∧ B) for any propositions A and B.
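Probabilism’s clauses can be checked mechanically for a toy credence function; the possible worlds, propositions, and weights below are illustrative assumptions:

```python
# Toy check of Probabilism. Propositions are modeled as sets of
# possible worlds; the worlds and weights are illustrative assumptions.

worlds = {"w1", "w2", "w3", "w4"}
A = {"w1", "w2"}
B = {"w2", "w3"}

# A credence function built by summing world weights is automatically
# a probability function over this algebra.
weight = {"w1": 0.1, "w2": 0.2, "w3": 0.3, "w4": 0.4}

def P(prop):
    return sum(weight[w] for w in prop)

# The three Probabilism clauses:
assert abs(P(worlds) - 1) < 1e-9          # tautology (whole space) gets 1
assert P(set()) == 0                      # contradiction (empty set) gets 0
assert abs(P(A | B) - (P(A) + P(B) - P(A & B))) < 1e-9   # additivity
```

Here `A | B` and `A & B` are set union and intersection, playing the roles of disjunction and conjunction.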

Over many years, Aharonov and coauthors have proposed a new interpretation of quantum mechanics: the two-time interpretation. This interpretation assigns two wavefunctions to a system, one of which propagates forwards in time and the other backwards. In this paper, I argue that this interpretation does not solve the measurement problem. In addition, I argue that it is neither necessary nor sufficient to attribute causal power to the backwards-evolving wavefunction ⟨Φ|, and thus its existence should be denied, contra the two-time interpretation. Finally, I follow Vaidman in giving an epistemological reading of ⟨Φ|.

Quite often, verification tasks for distributed systems are accomplished via counter abstractions. Such abstractions can sometimes be justified via simulations and bisimulations. In this work, we supply logical foundations to this practice by a specifically designed technique for second-order quantifier elimination. Our method, once applied to specifications of verification problems for parameterized distributed systems, produces systems over integer variables that are ready to be model-checked by current SMT-based tools. We demonstrate the feasibility of the approach with a prototype implementation and first experiments.
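The flavour of a counter abstraction can be sketched as follows. The toy mutual-exclusion protocol and its guards are assumptions for illustration, not the paper’s benchmark systems:

```python
# Illustrative counter abstraction (toy protocol assumed): instead of
# tracking each of N identical processes, track only how many are in
# each local state. The abstract state is a pair of integer counters
# (idle, critical), i.e. a system over integer variables.

def enter(counters):
    """One idle process enters the critical section.
    Guard: no process is currently critical (mutual exclusion)."""
    idle, crit = counters
    if idle > 0 and crit == 0:
        return (idle - 1, crit + 1)
    return counters  # guard fails: transition disabled

def leave(counters):
    """One critical process returns to idle."""
    idle, crit = counters
    if crit > 0:
        return (idle + 1, crit - 1)
    return counters

state = (3, 0)          # three idle processes, none critical
state = enter(state)    # -> (2, 1)
state = enter(state)    # guard fails: stays (2, 1), exclusion preserved
```

The abstraction is sound here because the processes are interchangeable: permuting process identities does not change which counter transitions are enabled, which is the kind of fact the simulation and bisimulation justifications make precise.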

In his (2010), Roger White argued for a principle of indifference. Hart and Titelbaum (2015) showed that White’s argument relied on an intuition about conditioning on biconditionals which, while widely shared, is incorrect. In their (2017), Hawthorne, Landes, Wallmann, and Williamson argue for a principle of indifference. Remarkably, their argument relies on the same faulty intuition. We explain their intuition, explain why it’s faulty, and show how it generates their principle of indifference.

The previous posts were quite raw and had me wrestling with new data. In this post, I try to be clearer and more accessible, and give a first outline of a new account of necessity that has emerged from my research on these topics. …

In a series of posts a few months ago (here, here, and here), I explored a particular method by which we might aggregate expert credences when those credences are incoherent. The result was this paper, which is now forthcoming in Synthese. …

This paper describes a decision procedure for disjunctions of conjunctions of anti-prenex normal forms of pure first-order logic (FOLDNFs) that do not contain ∨ within the scope of quantifiers. The disjuncts of these FOLDNFs are equivalent to prenex normal forms whose quantifier-free parts are conjunctions of atomic and negated atomic formulae (= Herbrand formulae). In contrast to the usual algorithms for Herbrand formulae, neither Skolemization nor unification algorithms with function symbols are applied. Instead, a procedure is described that rests on nothing but equivalence transformations within pure first-order logic (FOL). This procedure involves the application of a calculus for negative normal forms (the NNF-calculus) with A ⊣⊢ A ∧ A (= ∧I) as the sole rule that increases the complexity of given FOLDNFs. The described algorithm illustrates how, in the case of Herbrand formulae, decision problems can be solved through a systematic search for proofs that reduce the number of applications of the rule ∧I to a minimum in the NNF-calculus. In the case of Herbrand formulae, it is even possible to abstain entirely from applying ∧I. Finally, it is shown how the described procedure can be used within an optimized general search for proofs of contradiction, and what kind of questions arise for a ∧I-minimal proof strategy in the case of a general search for proofs of contradiction.
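For the ground (variable-free) case, deciding a conjunction of atomic and negated atomic formulae reduces to scanning for complementary literals: such a conjunction is contradictory exactly when some atom occurs both positively and negatively. The representation below is an assumed sketch of that standard fact, not the paper’s NNF-calculus:

```python
# Ground-case sketch (representation assumed): a literal is an
# (atom, polarity) pair, and a conjunction of ground literals is
# contradictory iff it contains a complementary pair.

def contradictory(literals):
    """literals: a set of (atom, polarity) pairs.
    Returns True iff some atom occurs with both polarities."""
    return any((atom, not pol) for (atom, pol) in literals
               if (atom, not pol) in literals)

print(contradictory({("P(a)", True), ("Q(a,b)", False)}))   # False
print(contradictory({("P(a)", True), ("P(a)", False)}))     # True
```

The interest of the paper lies precisely in what replaces this trivial check once quantifiers are present and neither Skolemization nor unification with function symbols is available.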

A mainstay assumption in natural-language semantics is that if-clauses bind indexical argument-places in then-clauses. Unfortunately, recent work (compare Santorio 2012) suggests that if-clauses can somehow act to ‘shift the context’. On the framework of Kaplan’s ‘Demonstratives’ (Kaplan 1977), that would be ‘monstrous’ and somehow impossible ‘in English’. The superseding framework of Lewis’s ‘Index, context, and content’ (Lewis 1980) instead maintains that an indexical argument-place is just one that is bindable (compare Stalnaker 2014, ch. 1), but maintains that these are rare — whereas the lesson of recent work is that they are pervasive.

This paper reveals two fallacies in Turing’s undecidability proof of first-order logic (FOL), namely, (i) an “extensional fallacy”: from the fact that a sentence is an instance of a provable FOL formula, it is inferred that a meaningful sentence is proven, and (ii) a “fallacy of substitution”: from the fact that a sentence is an instance of a provable FOL formula, it is inferred that a true sentence is proven. The first fallacy erroneously suggests that Turing’s proof of the non-existence of a circle-free machine that decides whether an arbitrary machine is circular proves a significant proposition. The second fallacy suggests that FOL is undecidable.

Cumming (2008) argues that his Masked Ball problem undermines Millianism, and that we must instead treat names as variables. However, although the Masked Ball does pose a problem for the Millian given a standard view about the meaning of ‘believes’, that view faces difficulties for independent reasons. I develop a novel “neo-Kaplanian” attitude semantics to address this problem, and go on to show that with this alternative semantics in hand, the Millian is quite capable of accounting for the Masked Ball.

Traditional definitions of lying involve at least two necessary conditions: a speaker lies only if she (i) asserts that p and (ii) believes that p is false. Given a full-belief framework, an adequate account of lying should distinguish mere insincerity (asserting what you don’t believe) from lying (asserting what you believe to be false). An account of lying in terms of a credence-accuracy framework rather than the traditional full-belief framework also ought to distinguish these layers. For we find it worse, and objectionably so, for a speaker to assert a proposition which she regards as highly likely to be false than to assert one in which she has a middling credence of, say, 0.5; and we find the latter worse than asserting a proposition in which one has high credence.

Philosophers of science since Nagel have been interested in the links between intertheoretic reduction and explanation, understanding and other forms of epistemic progress. Although intertheoretic reduction is widely agreed to occur in pure mathematics as well as empirical science, the relationship between reduction and explanation in the mathematical setting has rarely been investigated in a similarly serious way. This paper examines an important and well-known case: the reduction of arithmetic to set theory. I claim that the reduction is unexplanatory. In defense of this claim, I offer some evidence from mathematical practice, and I respond to contrary suggestions due to Steinhart, Maddy, Kitcher and Quine. I then show how, even if set-theoretic reductions are generally not explanatory, set theory can nevertheless serve as a legitimate and successful foundation for mathematics. Finally, some implications of my thesis for philosophy of mathematics and philosophy of science are discussed. In particular, I suggest that some reductions in mathematics are probably explanatory, and I propose that differing standards of theory acceptance might account for the apparent lack of unexplanatory reductions in the empirical sciences.

[I]n the philosophy of science the notions of explanation and reduction have been extensively discussed, even in formal frameworks, but there exist few successful and exact applications of the notions to actual theories, and, furthermore, any two philosophers of science seem to think differently about the question of how the notions should be reconstructed. On the other hand, philosophers of mathematics and mathematicians have been successful in defining and applying various exact notions of reduction (or interpretation), but they have not seriously studied the questions of explanation and understanding.
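The reduction at issue can be displayed concretely via the standard von Neumann coding of the natural numbers as pure sets, with 0 = ∅ and n + 1 = n ∪ {n}; the Python frozenset modeling below is only an illustrative sketch of that coding:

```python
# The von Neumann coding of natural numbers as pure sets, modeled with
# frozensets: 0 is the empty set, and n + 1 is n united with {n}.

zero = frozenset()

def succ(n):
    """Successor: n + 1 = n U {n}."""
    return n | frozenset([n])

def numeral(k):
    """The set coding the natural number k."""
    n = zero
    for _ in range(k):
        n = succ(n)
    return n

# Each number n is the set of its predecessors, so the numeral for n
# has exactly n elements, and the less-than ordering becomes membership:
three = numeral(3)
print(len(three))            # 3
print(numeral(2) in three)   # True: 2 < 3 becomes 2 ∈ 3
```

It is just this kind of coding whose explanatory credentials the paper disputes: the arithmetic facts are recovered, but arguably nothing about why they hold is illuminated.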

Much recent work on mathematical explanation has presupposed that the phenomenon involves explanatory proofs in an essential way. I argue that this view, ‘proof chauvinism’, is false. I then look in some detail at the explanation of the solvability of polynomial equations provided by Galois theory, which has often been thought to revolve around an explanatory proof. The paper concludes with some general worries about the effects of chauvinism on the theory of mathematical explanation. Near the beginning of ‘Mathematical Explanation’—the founding document of the recent literature on the subject—Mark Steiner writes: ‘Mathematical explanation exists. Mathematicians routinely distinguish proofs that merely demonstrate from proofs which explain’ ([Steiner 1978a], p. 135). Judging from his treatment of the subject in the rest of the paper, Steiner seems to intend the second claim as something like an elaboration of the first, rather than as an example of one sort of mathematical explanation among others. That is, Steiner appears to endorse approximately the following view: Proof Chauvinism: All or most cases of mathematical explanation involve explanatory proofs in an essential way.

“Curry’s paradox”, as the term is used by philosophers today, refers to a wide variety of paradoxes of self-reference or circularity that trace their modern ancestry to Curry (1942b) and Löb (1955).[1] The common characteristic of these so-called Curry paradoxes is the way they exploit a notion of implication, entailment or consequence, either in the form of a connective or in the form of a predicate. Curry’s paradox arises in a number of different domains. Like Russell’s paradox, it can take the form of a paradox of set theory or the theory of properties. But it can also take the form of a semantic paradox, closely akin to the Liar paradox.

The divine attributes of omniscience and omnipotence have faced objections to their very consistency. Such objections rely on reasoning parallel to the semantic paradoxes such as the Liar or the set-theoretic paradoxes like Russell’s paradox. With the advent of paraconsistent logics, dialetheism — the view that some contradictions are true — became a major player in the search for a solution to such paradoxes. This paper explores whether dialetheism, armed with the tools of paraconsistent logics, has the resources to respond to the objections levelled against the divine attributes.

When we design a complex system, we often start with a rough outline and fill in details later, one step at a time. And if the system is supposed to be adaptive, these details may need to change as the system is actually being used! …