1.
    Our aim in this paper is to extend the semantics for the kind of logic of ground developed in [deRosset and Fine, 2023]. In that paper, the authors very briefly suggested a way of treating universal and existential quantification over a fixed domain of objects. Here we explore some options for extending the treatment to allow for a variable domain of objects.
    Found 12 hours, 3 minutes ago on Louis deRosset's site
  2.
    This paper concerns the semantics for the logics of ground deriving from a slight variant GG of the logic of [Fine, 2012b] that have already been developed in [deRosset and Fine, 2023]. Our aim is to outline that semantics and to provide a comparison with two related semantics for ground, given in [Correia, 2017] and [Kramer, 2018a]. This will serve to highlight the strengths and difficulties of these different approaches. In particular, it will show how deRosset and Fine’s approach has a greater degree of flexibility in its ability to accommodate different extensions of a basic minimal system of ground. We shall assume that the reader is already acquainted with some of the basic work on ground and on the framework of truthmaker semantics. Some background material may be found in [Fine, 2012b, 2017a,b]. KEYWORDS: Impure Logic of Ground; Truthmaker Semantics; Logic of Ground; Ground
    Found 12 hours, 3 minutes ago on Louis deRosset's site
  3.
    We propose a framework for the analysis of choice behaviour when the latter is made explicitly in chronological order. We relate this framework to the traditional choice theoretic setting from which the chronological aspect is absent, and compare it to other frameworks that extend this traditional setting. Then, we use this framework to analyse various models of preference discovery. We characterise, via simple revealed preference tests, several models that differ in terms of (i) the priors that the decision-maker holds about alternatives and (ii) whether the decision-maker chooses period by period or uses her knowledge about future menus to inform her present choices. These results provide novel testable implications for the preference discovery process of myopic and forward-looking agents.
    Found 2 days, 14 hours ago on Nobuyuki Hanaki's site
  4.
    Consumption decisions are partly influenced by values and ideologies. Consumers care about global warming, child labor, fair trade, etc. We develop an axiomatic model of intrinsic values – those that are carriers of meaning in and of themselves – and argue that they often introduce discontinuities near zero. For example, a vegetarian’s preferences would be discontinuous near zero amount of animal meat. We distinguish intrinsic values from instrumental ones, which are means rather than ends and serve as proxies for intrinsic values. We illustrate the relevance of our value-based model in different contexts, including equity concerns and prosocial behavior.
    Found 2 days, 23 hours ago on Itzhak Gilboa's site
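    One simple way to picture the discontinuity described in the abstract above (my illustration, not the paper's axiomatic representation): if m is the amount of animal meat in a bundle and u is an otherwise continuous utility, an intrinsic value against meat-eating behaves like

        \[ U(x, m) \;=\; u(x, m) \;-\; \kappa \cdot \mathbf{1}[m > 0], \qquad \kappa > 0, \]

    which jumps at m = 0: arbitrarily small positive amounts of meat already incur the full penalty κ, whereas a merely instrumental concern would typically scale continuously with m.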
  5.
    This picture by Roice Nelson shows a remarkable structure: the hexagonal tiling honeycomb. What is it? Roughly speaking, a honeycomb is a way of filling 3d space with polyhedra. The most symmetrical honeycombs are the ‘regular’ ones. …
    Found 5 days, 15 hours ago on Azimuth
  6.
    This axiomatization parallels the structure of first order logic exactly. It can be read as a reduction of the axiom scheme of comprehension of TST(U) to finitely many axiom templates (up to type assignment) or as a reduction of the axiom scheme of stratified comprehension to finitely many axioms. Probably one should assume weak extensionality: nonempty sets with the same elements are equal.
    Found 5 days, 20 hours ago on M. Randall Holmes's site
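    For concreteness, the weak extensionality principle mentioned above ("nonempty sets with the same elements are equal") is standardly rendered, with type indices in the style of TST(U), as

        \[ \exists z^{\,n}\,(z^{\,n} \in x^{\,n+1}) \;\wedge\; \forall z^{\,n}\,(z^{\,n} \in x^{\,n+1} \leftrightarrow z^{\,n} \in y^{\,n+1}) \;\rightarrow\; x^{\,n+1} = y^{\,n+1}, \]

    which leaves urelements free to be distinct despite having the same (namely, no) elements.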
  7.
    I want to comment on an old objection to the “similarity analysis” of counterfactuals, and on a more recent, but related, argument for counterfactual skepticism. According to the similarity analysis, a counterfactual A > C is true iff C is true at all A-worlds that are most similar, in certain respects, to the actual world. The old objection that I have in mind is that the similarity analysis fails to validate Simplification of Disjunctive Antecedents (SDA), the inference from (A ∨ B) > C to A > C and B > C. Imagine someone utters (1a) on a hot summer day.
    Found 6 days, 12 hours ago on Wolfgang Schwarz's site
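    For readers new to the old objection, the standard countermodel is this: if the most similar (A ∨ B)-worlds are all A-worlds at which C holds, while the most similar B-worlds are ¬C-worlds, then (A ∨ B) > C comes out true although B > C is false. Hence, on the similarity analysis,

        \[ (A \vee B) > C \;\not\vDash\; B > C, \]

    which is exactly the failure of SDA.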
  8.
    Jc Beall’s Divine Contradiction is a fascinating defence of the idea that contradictions are true of the tri-personal God. This project requires a logic that avoids the consequence that every proposition follows from a contradiction. Beall presents such a logic. This ‘gap/glut’ logic is the topic of this article. A gap/glut logic presupposes that falsity is not simply the absence of truth – for a proposition that is true may also be false. This article is essentially an examination of the idea that falsity is not simply untruth. The author rejects this position but does not claim to have an argument against it. In lieu of an argument, he presents three ‘considerations’. First, it is possible to give an intuitive semantics for the language of sentential logic that yields ‘classical’ sentential logic (including ‘p, ¬ p ⊢ q’) and which makes no mention of truth-values. Second, it is possible to imagine a race who manage their affairs very well without having the concept ‘falsity’. Third, it is possible to construct a semantics that yields a logic identical with the dialetheist logic and which makes no mention of truth-values – and which, far from being plausible, seems pointless.
    Found 6 days, 14 hours ago on Peter van Inwagen's site
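    Beall's own gap/glut logic is not reproduced in the summary above; as a minimal sketch of how a gap/glut semantics blocks explosion, the Python check below assigns each atom a subset of {'t', 'f'} (true only, false only, both, or neither), treats validity as preservation of being at least true, and searches for a counterexample to p, ¬p ⊢ q.

        from itertools import product

        def neg(v):
            # negation on truth sets: not-A is true iff A is false, and false iff A is true
            return {'t' for x in v if x == 'f'} | {'f' for x in v if x == 't'}

        def entails(premises, conclusion, atoms):
            # premises/conclusion are functions from a valuation (atom -> truth set) to a truth set;
            # validity = every valuation making all premises at least true makes the conclusion at least true
            values = [set(), {'t'}, {'f'}, {'t', 'f'}]
            for assignment in product(values, repeat=len(atoms)):
                val = dict(zip(atoms, assignment))
                if all('t' in p(val) for p in premises) and 't' not in conclusion(val):
                    return False, val   # counterexample found
            return True, None

        # Explosion fails: the check returns a valuation on which p is both true and
        # false (a glut) while q is not true at all.
        print(entails([lambda v: v['p'], lambda v: neg(v['p'])], lambda v: v['q'], ['p', 'q']))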
  9.
    This analysis shows Cantor's diagonal definition in his 1891 paper was not compatible with his horizontal enumeration of the infinite set M. The diagonal sequence was a counterfeit which he used to produce an apparent exclusion of a single sequence to prove the cardinality of M is greater than the cardinality of the set of integers N.
    Found 1 week ago on PhilSci Archive
  10.
    I propose an approach to liar and Curry paradoxes inspired by the work of Roger Swyneshed in his treatise on insolubles (1330-1335). The keystone of the account is the idea that liar sentences and their ilk are false (and only false) and that the so-called “capture” direction of the T-schema should be restricted. The proposed account retains what I take to be the attractive features of Swyneshed’s approach without leading to some worrying consequences Swyneshed accepts. The approach and the resulting logic (called “Swynish Logic”) are non-classical, but are consistent and compatible with many elements of the classical picture including modus ponens, modus tollens, and double-negation elimination and introduction. It is also compatible with bivalence and contravalence. My approach to these paradoxes is also immune to an important kind of revenge challenge that plagues some of its rivals.
    Found 1 week, 2 days ago on PhilPapers
  11.
    Throughout the history of automated reasoning, mathematics has been viewed as a prototypical domain of application. It is therefore surprising that the technology has had almost no impact on mathematics to date and plays almost no role in the subject today. This article presents an optimistic view that the situation is about to change. It describes some recent developments in the Lean programming language and proof assistant that support this optimism, and it reflects on the role that automated reasoning can and should play in mathematics in the years to come.
    Found 1 week, 2 days ago on Jeremy Avigad's site
  12.
    Suppose that we have n objects α₁, ..., αₙ, and we want to define something like numerical values (at least hyperreal ones, if we can’t have real ones) on the basis of comparisons of value. Here is one interesting way to proceed. …
    Found 1 week, 3 days ago on Alexander Pruss's Blog
  13.
    We show that knowledge satisfies interpersonal independence, meaning that a non-trivial sentence describing one agent’s knowledge cannot be equivalent to a sentence describing another agent’s knowledge. The same property of interpersonal independence holds, mutatis mutandis, for belief. In the case of knowledge, interpersonal independence is implied by the fact that there are no non-trivial sentences that are common knowledge in every model of knowledge. In the case of belief, interpersonal independence follows from a strong interpersonal independence that knowledge does not have. Specifically, there is no sentence describing the beliefs of one person that implies a sentence describing the beliefs of another person.
    Found 1 week, 6 days ago on PhilSci Archive
  14.
    The application of case-based reasoning varies in complexity and depends, in particular, on whether relevant past decisions agree, or exist at all. The contribution of this paper is a formal treatment of types of the hardness of case-based decisions. The typology of hardness is defined in terms of the arguments for and against the issue to be decided, and their kind of validity (conclusive, presumptive, coherent, incoherent). We apply the typology of hardness to Berman and Hafner’s research on the dynamics of case-based reasoning and show formally how the hardness of decisions varies with time.
    Found 1 week, 6 days ago on Davide Grossi's site
  15.
    This paper examines the logic of conditional obligation, which originates from the works of Hansson, Lewis, and others. Some weakened forms of transitivity of the betterness relation are studied. These are quasi-transitivity, Suzumura consistency, acyclicity and the interval order condition. The first three do not change the logic: the axiomatic system is the same whether or not they are introduced, and this holds true under a rule of interpretation in terms of maximality as well as under one in terms of strong maximality. The interval order condition gives rise to a new axiom, and which axiom one obtains depends on the rule of interpretation. With the rule of maximality, one obtains the principle known as disjunctive rationality. With the rule of strong maximality, one obtains the Spohn axiom (also known as the principle of rational monotony, or Lewis’ axiom CV). A completeness theorem further substantiates these observations. For the interval order condition, this also yields the finite model property and decidability of the calculus.
    Found 1 week, 6 days ago on X. Parent's site
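    For orientation, here are standard formulations of the four weakenings mentioned above, stated for a betterness relation ≽ with strict part ≻ (the paper's own formulations may differ in detail):

        \begin{align*}
        \text{quasi-transitivity:}\quad & x \succ y \ \wedge\ y \succ z \ \Rightarrow\ x \succ z\\
        \text{Suzumura consistency:}\quad & x_1 \succsim x_2 \succsim \cdots \succsim x_n \ \Rightarrow\ \neg(x_n \succ x_1)\\
        \text{acyclicity:}\quad & x_1 \succ x_2 \succ \cdots \succ x_n \ \Rightarrow\ \neg(x_n \succ x_1)\\
        \text{interval order (Ferrers):}\quad & x \succ y \ \wedge\ z \succ w \ \Rightarrow\ x \succ w \ \vee\ z \succ y
        \end{align*}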
  16.
    Transitivity, Simplification, and Contraposition are intuitively compelling. Although Antecedent Strengthening may seem less attractive at first, close attention to the full range of data reveals that it too has considerable appeal. An adequate theory of conditionals should account for these facts. The strict theory of conditionals does so by validating the four inferences. It says that natural language conditionals are necessitated material conditionals: A > B is true if and only if the material conditional A ⊃ B is true throughout a set of accessible worlds. As a result, it validates many classical inferences, including Transitivity, Simplification, Contraposition, and Antecedent Strengthening. In what follows I will refer to these as the strict inferences.
    Found 2 weeks ago on PhilPapers
  17.
    Let serious propositional contingentism (SPC) be the package of views which consists in (i) the thesis that propositions expressed by sentences featuring terms depend, for their existence, on the existence of the referents of those terms, (ii) serious actualism—the view that it is impossible for an object to exemplify a property and not exist—and (iii) contingentism—the view that it is at least possible that some thing might not have been something. SPC is popular and compelling. But what should we say about possible worlds, if we accept SPC? Here, I first show that a natural view of possible worlds, well-represented in the literature, in conjunction with SPC is inadequate. Though I note various alternative ways of thinking about possible worlds in response to the first problem, I then outline a second more general problem—a master argument—which generally shows that any account of possible worlds meeting very minimal requirements will be inconsistent with compelling claims about mere possibilia which the serious propositional contingentist should accept.
    Found 2 weeks ago on PhilPapers
  18.
    As usually presented, octagons of opposition are rather complex objects and can be difficult to assimilate at a glance. We show how, under suitable conditions that are satisfied by most historical examples, different display conventions can simplify the diagrams, making them easier for readers to grasp without the loss of information. Moreover, those conditions help reveal the conceptual structure behind the visual display.
    Found 2 weeks, 1 day ago on David Makinson's site
  19.
    We extend a result by Gallow concerning the impossibility of following two epistemic masters, so that it covers a larger class of pooling methods. We also investigate a few ways of avoiding the issue, such as using nonconvex pooling methods, employing the notion of imperfect trust or moving to higher-order probability spaces. Along the way we suggest a conceptual issue with the conditions used by Gallow: whenever two experts are considered, whether we can trust one of them is decided by the features of the other!
    Found 2 weeks, 3 days ago on PhilSci Archive
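    For reference, a convex (linear) pooling method combines the experts' credence functions P_1, …, P_n by a weighted average,

        \[ P_{\mathrm{pool}}(A) \;=\; \sum_{i=1}^{n} \lambda_i\, P_i(A), \qquad \lambda_i \ge 0, \quad \sum_{i=1}^{n} \lambda_i = 1; \]

    "nonconvex" methods, such as geometric pooling, fall outside this form.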
  20.
    The simulation hypothesis has recently excited renewed interest, especially in the physics and philosophy communities. However, the hypothesis specifically concerns computers that simulate physical universes, which means that to properly investigate it we need to couple computer science theory with physics. Here I do this by exploiting the physical Church-Turing thesis. This allows me to introduce a preliminary investigation of some of the computer science theoretic aspects of the simulation hypothesis. In particular, building on Kleene’s second recursion theorem, I prove that it is mathematically possible for us to be in a simulation that is being run on a computer by us. In such a case, there would be two identical instances of us; the question of which of those is “really us” is meaningless. I also show how Rice’s theorem provides some interesting impossibility results concerning simulation and self-simulation; briefly describe the philosophical implications of fully homomorphic encryption for (self-)simulation; briefly investigate the graphical structure of universes simulating universes simulating universes, among other issues. I end by describing some of the possible avenues for future research that this preliminary investigation reveals.
    Found 3 weeks ago on PhilSci Archive
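    Kleene's second recursion theorem, which the abstract above leans on, guarantees programs that can operate on their own description; the familiar concrete instance is a quine. A minimal Python illustration of that kind of self-reference (not the paper's construction):

        # The two lines below reproduce themselves exactly when run (this comment aside).
        s = 's = {!r}\nprint(s.format(s))'
        print(s.format(s))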
  21.
    Within the context of general relativity, Leibnizian metaphysics seems to demand that worlds are “maximal” with respect to a variety of space-time properties (Geroch 1970; Earman 1995). Here, we explore maximal worlds with respect to the “Heraclitus” asymmetry property, which demands that no pair of spacetime events has the same structure (Manchak and Barrett 2023). First, we show that Heraclitus-maximal worlds exist and that every Heraclitus world is contained in some Heraclitus-maximal world. This amounts to a type of compatibility between the Leibnizian and Heraclitian demands. Next, we consider the notion of “observationally indistinguishable” worlds (Glymour 1972, 1977; Malament 1977). We know that, modulo modest assumptions, any world is observationally indistinguishable from some other (non-isomorphic) world (Manchak 2009). But here we show a way out of this general epistemic predicament: if attention is restricted to Heraclitus-maximal worlds, then worlds are observationally indistinguishable if and only if they are isomorphic. Finally, we show a sense in which cosmic underdetermination can still arise for individual observers even if the Leibnizian and Heraclitian demands are met.
    Found 3 weeks, 2 days ago on PhilSci Archive
  22.
    This paper examines different kinds of definite descriptions denoting purely contingent, necessary or impossible objects. The discourse about contingent/impossible/necessary objects can be organised in terms of rational questions to ask and answer relative to the modal profile of the entity in question. There are also limits on what it is rational to know about entities with this or that modal profile. We will also examine epistemic modalities; they are the kind of necessity and possibility that is determined by epistemic constraints related to knowledge or rationality. Definite descriptions denote so-called offices, roles, or things to be. We explicate these α-offices as partial functions from possible worlds to chronologies of objects of type α, where α is mostly the type of individuals. Our starting point is Prior’s distinction between a ‘weak’ and ‘strong’ definite article ‘the’. In both cases, the definite description refers to at most one object; yet, in the case of the weak ‘the’, the referred object can change over time, while in the case of the strong ‘the’, the object referred to by the definite description is the same forever, once the office has been occupied. The main result we present is how to obtain Wh-knowledge about who or what plays a given role presented by a hyper-office, i.e. a procedure producing an office. Another no less important result concerns the epistemic necessity of the impossibility of knowing who or what occupies the impossible office presented by a hyper-office.
    Found 3 weeks, 4 days ago on PhilSci Archive
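    A gloss on the type talk above, in the usual notation of Transparent Intensional Logic (the paper's own conventions may differ): with ω the type of possible worlds, τ of times, and ι of individuals, an individual office is an intension of type

        \[ ((\iota\,\tau)\,\omega), \qquad \text{abbreviated } \iota_{\tau\omega}, \]

    i.e. a partial function taking each world to a chronology, itself a partial function from times to the individual (if any) occupying the office then; an α-office replaces ι by an arbitrary type α.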
  23.
    First, we suggest and discuss second-order versions of properties for solutions for TU games used to characterize the Banzhaf value, in particular, of standardness for two-player games, of the dummy player property, and of 2-efficiency. Then, we provide a number of characterizations of the Banzhaf value invoking the following properties: (i) [second-order standardness for two-player games or the second-order dummy player property] and 2-efficiency, (ii) standardness for one-player games, standardness for two-player games, and second-order 2-efficiency, (iii) standardness for one-player games, [second-order standardness for two-player games or the second-order dummy player property], and second-order 2-efficiency. These characterizations also work within the classes of simple games, of superadditive games, and of simple superadditive games.
    Found 3 weeks, 4 days ago on André Casajus's site
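    For reference (standard background, not restated in the abstract above), the Banzhaf value of player i in a TU game (N, v) is the average of i's marginal contributions over all coalitions not containing i,

        \[ \mathrm{Bz}_i(N, v) \;=\; \frac{1}{2^{|N|-1}} \sum_{S \subseteq N \setminus \{i\}} \bigl( v(S \cup \{i\}) - v(S) \bigr), \]

    and 2-efficiency, in its standard form, requires that merging two players into a single player leaves their combined payoff unchanged.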
  24.
    Reconstructions of quantum theory are a novel research program in theoretical physics which aims to uncover the unique physical features of quantum theory via axiomatization. I focus on Hardy’s “Quantum Theory from Five Reasonable Axioms” (2001), arguing that reconstructions represent a modern usage of axiomatization with significant points of continuity to von Neumann’s axiomatizations in quantum mechanics. In particular, I show that Hardy and von Neumann share similar methodological ordering, have a common operational framing, and insist on the empirical basis of axioms. In the reconstruction programme, interesting points of discontinuity with historical axiomatizations include the stipulation of a generalized space of theories represented by a framework and the stipulation of analytic machinery at two levels of generality (first by establishing a generalized mathematical framework and then by positing specific formulations of axioms). In light of the reconstruction programme, I show that we should understand axiomatization attempts as being context–dependent, where the context is contingent upon the goals of inquiry and the maturity of both mathematical formalism and theoretical underpinnings within the area of inquiry. Drawing on Mitsch’s (2022) account of axiomatization, I conclude that reconstructions should best be understood as provisional, practical representations of quantum theory that are well suited for theory development and exploration. However, I propose my context–dependent re–framing of axiomatization as a means of enriching Mitsch’s account.
    Found 4 weeks, 1 day ago on PhilSci Archive
  25.
    A drawback of the standard modal ontological proof is that it assumes that it is possible that there is something godlike. Kurt Gödel’s ontological proof seeks to establish this possibility with the help of certain axiological principles. But the axiological principles he relies on are not very plausible. And the same goes for other Gödelian ontological proofs in the literature. In this paper, I put forward a Gödelian ontological proof that only relies on plausible axiological principles. And I adapt the proof for both constant and variable domains. Nevertheless, the proof still needs the axiom that being godlike is positive in the sense of being a “purely good”-making property.
    Found 1 month ago on Johan E. Gustafsson's site
  26.
    This paper investigates the conditions under which diagonal sentences can be taken to constitute paradigmatic cases of self-reference. We put forward well-motivated constraints on the diagonal operator and the coding apparatus which separate paradigmatic self-referential sentences, for instance obtained via Gödel’s diagonalization method, from accidental diagonal sentences. In particular, we show that these constraints successfully exclude refutable Henkin sentences, as constructed by Kreisel.
    Found 1 month, 1 week ago on Volker Halbach's site
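    The diagonalization method and the Henkin sentences referred to above take the following standard form over a suitable arithmetical base theory such as PA:

        \[ \text{Diagonal lemma: for every formula } \varphi(x) \text{ there is a sentence } \psi \text{ with } \vdash \psi \leftrightarrow \varphi(\ulcorner \psi \urcorner); \]
        \[ \text{Henkin sentence: the case } \varphi(x) = \mathrm{Prov}(x), \text{ i.e. a } \psi \text{ with } \vdash \psi \leftrightarrow \mathrm{Prov}(\ulcorner \psi \urcorner). \]

    As Kreisel showed, the status of such a ψ depends on the choice of provability predicate and coding; this is the kind of sensitivity the paper's constraints are meant to regiment.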
  27.
    We introduce and analyze a new axiomatic theory CD of truth. The primitive truth predicate can be applied to sentences containing the truth predicate. The theory is thoroughly classical in the sense that CD is not only formulated in classical logic, but that the axiomatized notion of truth itself is classical: The truth predicate commutes with all quantifiers and connectives, and thus the theory proves that there are no truth value gaps or gluts. To avoid inconsistency, the instances of the T-schema are restricted to determinate sentences. Determinateness is introduced as a further primitive predicate and axiomatized. The semantics and proof theory of CD are analyzed.
    Found 1 month, 1 week ago on Volker Halbach's site
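    Schematically, and only as a rendering of the description above (not the official axiom list of CD), the commutation and restricted disquotation principles look like this:

        \[ T\ulcorner \neg\varphi \urcorner \leftrightarrow \neg T\ulcorner \varphi \urcorner, \qquad T\ulcorner \varphi \wedge \psi \urcorner \leftrightarrow T\ulcorner \varphi \urcorner \wedge T\ulcorner \psi \urcorner, \qquad D\ulcorner \varphi \urcorner \rightarrow \bigl( T\ulcorner \varphi \urcorner \leftrightarrow \varphi \bigr), \]

    where D is the primitive determinateness predicate: the commutation axioms make truth behave classically (no gaps or gluts), and disquotation is confined to determinate sentences.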
  28.
    Following Post’s program, we will propose a linguistic and empirical interpretation of Gödel’s incompleteness theorem and related ones on unsolvability by Church and Turing. All these theorems use the diagonal argument by Cantor in order to find limitations in finitary systems, such as human language, which can make “infinite use of finite means”. The linguistic version of the incompleteness theorem says that every Turing complete language is Gödel incomplete. We conclude that the incompleteness and unsolvability theorems find limitations in our finitary tool, which is our complete language.
    Found 1 month, 2 weeks ago on PhilPapers
  29.
    We describe a linear time algorithm that determines all “two-vertex bottlenecks” in a directed graph. This gives all pairs of vertices that disconnect two given nodes s and t in a directed graph. There may be quadratically many two-vertex bottlenecks, but a compressed representation allows them to all be determined in linear time. Applications include the determination of Dual Implication Points (DIPs) in the CDCL solver conflict graph, as discussed in Buss, Chung, Ganesh, and Oliveras [preprint, 2024]. The algorithm for finding all DIPs is an algorithm for Menger’s Theorem on a directed graph that not only verifies that two given vertices are not 3-connected but also finds all possible separating vertex pairs.
    Found 1 month, 2 weeks ago on Samuel R. Buss's site
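    As a reference point for the definition (not the paper's linear-time algorithm, which relies on a compressed representation), here is a brute-force Python sketch that enumerates two-vertex bottlenecks by removing each candidate pair and testing s-to-t reachability:

        from itertools import combinations
        from collections import deque

        def reachable(adj, s, t, removed):
            # BFS from s to t in the directed graph adj (dict: vertex -> list of successors),
            # ignoring the vertices in `removed`
            if s in removed or t in removed:
                return False
            seen, queue = {s}, deque([s])
            while queue:
                x = queue.popleft()
                if x == t:
                    return True
                for y in adj.get(x, ()):
                    if y not in seen and y not in removed:
                        seen.add(y)
                        queue.append(y)
            return False

        def two_vertex_bottlenecks(adj, s, t):
            # all pairs {u, v} (both distinct from s and t) whose removal leaves no s-to-t path;
            # each pair gets its own BFS, so this is far from the linear time of the paper
            vertices = set(adj) | {y for ys in adj.values() for y in ys}
            candidates = sorted(vertices - {s, t})
            return [set(p) for p in combinations(candidates, 2)
                    if not reachable(adj, s, t, set(p))]

        # Two disjoint s-to-t paths: s->a->b->t and s->c->d->t; the bottleneck pairs are
        # exactly those taking one vertex from each path: {a,c}, {a,d}, {b,c}, {b,d}.
        g = {'s': ['a', 'c'], 'a': ['b'], 'b': ['t'], 'c': ['d'], 'd': ['t']}
        print(two_vertex_bottlenecks(g, 's', 't'))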
  30.
    This paper is a discussion note on Isaacs et al. 2022, who have claimed to offer a new motivation for imprecise probabilities, based on the mathematical phenomenon of nonmeasurability. In this note, I clarify some consequences of their proposal. In particular, I show that if their proposal is applied to a bounded 3-dimensional space, then they have to reject at least one of the following: • If A is at most as probable as B and B is at most as probable as C, then A is at most as probable as C.
    Found 1 month, 2 weeks ago on PhilPapers