1.
    The news these days feels apocalyptic to me—as if we’re living through, if not the last days of humanity, then surely the last days of liberal democracy on earth. All the more reason to ignore all of that, then, and blog instead about the notorious Busy Beaver function! …
    Found 8 hours, 8 minutes ago on Scott Aaronson's blog
  2.
    We define a notion of inaccessibility of a decision between two options represented by utility functions, where the decision is based on the order of the expected values of the two utility functions. The inaccessibility expresses that the decision cannot be obtained if the expectation values of the utility functions are calculated using the conditional probability defined by a prior and by partial evidence about the probability that determines the decision. Examples of inaccessible decisions are given in finite probability spaces. Open questions and conjectures about inaccessibility of decisions are formulated. The results are interpreted as showing the crucial role of priors in Bayesian taming of epistemic uncertainties about probabilities that determine decisions based on utility maximizing.
    Found 6 days, 2 hours ago on PhilSci Archive
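The core comparison in this abstract — a decision fixed by the ordering of two expected utilities, which in turn depends on the probability used to compute them — can be sketched with toy numbers (mine, not the paper's): the same pair of utility functions can rank either way depending on the prior.

```python
# Toy illustration (hypothetical numbers, not from the paper): the decision
# between options A and B is determined by which expected utility is larger,
# and that ordering can flip when the underlying probability changes.

# A finite probability space with three outcomes.
outcomes = [0, 1, 2]
u_a = {0: 10.0, 1: 0.0, 2: 0.0}   # utility of option A at each outcome
u_b = {0: 0.0, 1: 4.0, 2: 4.0}    # utility of option B at each outcome

def expected(u, p):
    """Expected value of utility u under probability p."""
    return sum(u[w] * p[w] for w in outcomes)

prior1 = {0: 0.5, 1: 0.25, 2: 0.25}   # under this probability, A wins
prior2 = {0: 0.2, 1: 0.4, 2: 0.4}     # under this probability, B wins

print(expected(u_a, prior1) > expected(u_b, prior1))  # True
print(expected(u_a, prior2) > expected(u_b, prior2))  # False
```

The paper's point is subtler — partial evidence about the decision-determining probability, filtered through a prior, may fail to recover the ordering at all — but this is the basic object being studied.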
  3.
    We report on the mechanization of (preference-based) conditional normative reasoning. Our focus is on Åqvist’s system E for conditional obligation, and its extensions. Our mechanization is achieved via a shallow semantical embedding in Isabelle/HOL. We consider two possible uses of the framework. The first one is as a tool for meta-reasoning about the considered logic. We employ it for the automated verification of deontic correspondences (broadly conceived) and related matters, analogous to what has been previously achieved for the modal logic cube. The equivalence is automatically verified in one direction, leading from the property to the axiom. The second use is as a tool for assessing ethical arguments. We provide a computer encoding of a well-known paradox (or impossibility theorem) in population ethics, Parfit’s repugnant conclusion. While some have proposed overcoming the impossibility theorem by abandoning the presupposed transitivity of “better than,” our formalisation unveils a less extreme approach, suggesting among other things the option of weakening transitivity suitably rather than discarding it entirely. Whether the presented encoding increases or decreases the attractiveness and persuasiveness of the repugnant conclusion is a question we would like to pass on to philosophy and ethics.
    Found 1 week ago on X. Parent's site
  4.
    When is it explanatorily better to adopt a conjunction of explanatory hypotheses as opposed to committing to only some of them? Although conjunctive explanations are inevitably less probable than less committed alternatives, we argue that the answer is not ‘never’. This paper provides an account of the conditions under which explanatory considerations warrant a preference for less probable, conjunctive explanations. After setting out four formal conditions that must be met by such an account, we consider the shortcomings of several approaches. We develop an account that avoids these shortcomings and then defend it by applying it to a well-known example of explanatory reasoning in contemporary science.
    Found 1 week, 1 day ago on PhilSci Archive
  5.
    Standard textbooks on quantum mechanics present the theory in terms of Hilbert spaces over the field of complex numbers and complex linear operator algebras acting on these spaces. What would be lost (or gained) if a different scalar field, e.g. the real numbers or the quaternions, were used? This issue arose with the birthing of the new quantum theory, and over the decades it has been raised over and over again, drawing a variety of different opinions. Here I attempt to identify and to clarify some of the key points of contention, focusing especially on procedures for complexifying real Hilbert spaces and real algebras of observables.
    Found 1 week, 1 day ago on PhilSci Archive
  6.
    I hope this is my last post for a while on Integrated Information Theory (IIT), in Aaronson’s simplified formulation. One of the fun and well-known facts is that if you have an impractically large square two-dimensional grid of interconnected logic gates (presumably with some constant time-delay in each gate between inputs and outputs to prevent race conditions) in a fixed point (i.e., nothing is changing), the result can still have a degree of integrated information proportional to the square root of the number of gates. …
    Found 1 week, 6 days ago on Alexander Pruss's Blog
  7.
    I’m still thinking about Integrated Information Theory (IIT), in Aaronson’s simplified formulation. Aaronson’s famous criticisms show pretty convincingly that IIT fails to correctly characterize consciousness: simple but large systems of unchanging logic gates end up having human-level consciousness on IIT. …
    Found 2 weeks, 2 days ago on Alexander Pruss's Blog
  8.
    The aim of this paper is to present a constructive solution to Frege’s puzzle (largely limited to the mathematical context) based on type theory. Two ways in which an equality statement may be said to have cognitive significance are distinguished. One concerns the mode of presentation of the equality, the other its mode of proof. Frege’s distinction between sense and reference, which emphasizes the former aspect, cannot adequately explain the cognitive significance of equality statements unless a clear identity criterion for senses is provided. It is argued that providing a solution based on proofs is more satisfactory from the standpoint of constructive semantics.
    Found 2 weeks, 4 days ago on PhilSci Archive
  9.
    This is a report on the project “Axiomatizing Conditional Normative Reasoning” (ANCoR, M 3240-N) funded by the Austrian Science Fund (FWF). The project aims to deepen our understanding of conditional normative reasoning by providing an axiomatic study of it at the propositional but also first-order level. The focus is on a particular framework, the so-called preference-based logic for conditional obligation, whose main strength has to do with the treatment of contrary-to-duty reasoning and reasoning about exceptions. The project considers not only the meta-theory of this family of logics but also its mechanization.
    Found 2 weeks, 6 days ago on X. Parent's site
  10.
    According to the ω-rule, it is valid to infer that all natural numbers possess some property, if 0 possesses it, 1 possesses it, 2 possesses it, and so on. The ω-rule is important because its inclusion in certain arithmetical theories results in true arithmetic. It is controversial because it seems impossible for finite human beings to follow, given that it seems to require accepting infinitely many premises. Inspired by a remark of Wittgenstein’s, I argue that the mystery of how we follow the ω-rule subsides once we treat the rule as helping to give meaning to the symbol, “…”.
    Found 2 weeks, 6 days ago on PhilSci Archive
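The ω-rule the abstract discusses can be displayed as an inference with infinitely many premises, one for each numeral:

```latex
\frac{\varphi(0) \qquad \varphi(1) \qquad \varphi(2) \qquad \cdots}
     {\forall n\, \varphi(n)}
\quad (\omega\text{-rule})
```

Closing an arithmetical theory such as PA under this rule yields exactly the truths of arithmetic, which is why the rule matters despite no finite reasoner being able to check its premises one by one.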
  11.
    We give a new and elementary construction of primitive positive decomposition of higher arity relations into binary relations on finite domains. Such decompositions come up in applications to constraint satisfaction problems, clone theory and relational databases. The construction exploits functional completeness of 2-input functions in many-valued logic by interpreting relations as graphs of partially defined multivalued ‘functions’. The ‘functions’ are then composed from ordinary functions in the usual sense. The construction is computationally effective and relies on well-developed methods of functional decomposition, but reduces relations only to ternary relations. An additional construction then decomposes ternary into binary relations, also effectively, by converting certain disjunctions into existential quantifications. The result gives a uniform proof of Peirce’s reduction thesis on finite domains, and shows that the graph of any Sheffer function composes all relations there.
    Found 2 weeks, 6 days ago on PhilSci Archive
  12.
    We study logical reduction (factorization) of relations into relations of lower arity by Boolean or relative products that come from applying conjunctions and existential quantifiers to predicates, i.e. by primitive positive formulas of predicate calculus. Our algebraic framework unifies natural joins and data dependencies of database theory and relational algebra of clone theory with the bond algebra of C.S. Peirce. We also offer new constructions of reductions, systematically study irreducible relations and reductions to them, and introduce a new characteristic of relations, ternarity, that measures their ‘complexity of relating’ and allows us to refine reduction results. In particular, we refine Peirce’s controversial reduction thesis, and show that reducibility behavior is dramatically different on finite and infinite domains.
    Found 2 weeks, 6 days ago on PhilSci Archive
  13.
    We argue that traditional formulations of the reduction thesis that tie it to privileged relational operations do not suffice for Peirce’s justification of the categories, and invite the charge of gerrymandering to make it come out as true. We then develop a more robust invariant formulation of the thesis by explicating the use of triads in any relational operations, which is immune to that charge. The explication also allows us to track how Thirdness enters the structure of higher order relations, and even propose a numerical measure of it. Our analysis reveals new conceptual phenomena when negation or disjunction are used to compound relations.
    Found 2 weeks, 6 days ago on PhilSci Archive
  14.
    The formalism of generalized quantum histories allows a symmetrical treatment of space and time correlations, by taking different traces of the same history density matrix. We recall how to characterize spatial and temporal entanglement in this framework. An operative protocol is presented to map a history state into the ket of a static composite system. We show, by examples, how the Leggett-Garg and the temporal CHSH inequalities can be violated in our approach.
    Found 3 weeks, 3 days ago on PhilSci Archive
  15.
    Wilhelm (Synthese 199:6357–6369, 2021) has recently defended a criterion for comparing structure of mathematical objects, which he calls Subgroup. He argues that Subgroup is better than SYM, another widely adopted criterion. We argue that this is mistaken; Subgroup is strictly worse than SYM. We then formulate a new criterion that improves on both SYM and Subgroup, answering Wilhelm’s criticisms of SYM along the way. We conclude by arguing that no criterion that looks only to the automorphisms of mathematical objects to compare their structure can be fully satisfactory.
    Found 3 weeks, 4 days ago on James Owen Weatherall's site
  16.
    Given synthetic Euclidean geometry, I define length λ(a, b) (of a segment ab), by taking equivalence classes with respect to the congruence relation, ≡: i.e., λ(a, b) = λ(c, d) ↔ ab ≡ cd. By geometric constructions and explicit definitions, one may define the Length structure, L = (L, ,⊕,⪯, ), “instantiated by Euclidean geometry”, so to speak. One may show that this structure is isomorphic to the set of non-negative elements of the one-dimensional linearly ordered vector space over R. One may define the notion of a numerical scale (for length) and a unit (for length). One may show how numerical scales for length are determined by Cartesian coordinate systems. One may also obtain a derivation of Maxwell’s quantity formula, Q = {Q}[Q], for lengths.
    Found 1 month ago on PhilSci Archive
  17.
    Let us consider an acyclic causal model M of the sort that is central to causal modeling (Spirtes et al. 1993/2000, Pearl 2000/2009, Halpern 2016, Hitchcock 2018). Readers familiar with them can skip this section. M = ⟨S, F⟩ is a causal model if, and only if, S is a signature and F = {F1, . . . , Fn} represents a set of n structural equations, for a finite natural number n. S = ⟨U, V, R⟩ is a signature if, and only if, U is a finite set of exogenous variables, V = {V1, . . . , Vn} is a set of n endogenous variables that is disjoint from U, and R assigns to each exogenous or endogenous variable X in U ∪ V its range (not co-domain) R(X) ⊆ ℝ. F = {F1, . . . , Fn} represents a set of n structural equations if, and only if, for each natural number i, 1 ≤ i ≤ n: Fi is a function from the Cartesian product ∏_{X ∈ U∪V∖{Vi}} R(X) of the ranges of all exogenous and endogenous variables other than Vi into the range R(Vi) of the endogenous variable Vi. The set of possible worlds of the causal model M is defined as the Cartesian product W = ∏_{X ∈ U∪V} R(X) of the ranges of all exogenous and endogenous variables.
    Found 1 month ago on PhilSci Archive
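The definitions above can be made concrete in a small sketch (variable names and equations are my own illustration, not from the paper): exogenous variables receive their values from outside the model, each endogenous variable is computed from the other variables by its structural equation, and a possible world is an assignment of values to all variables.

```python
# Minimal acyclic causal model: the structural equations are listed in an
# order consistent with acyclicity, so each endogenous variable can be
# evaluated once the variables it depends on have values.

exogenous = {"U1": 1.0, "U2": 2.0}   # values set from outside the model

equations = [
    # (endogenous variable, its structural equation F_i)
    ("V1", lambda vals: vals["U1"] + vals["U2"]),
    ("V2", lambda vals: 2.0 * vals["V1"]),
]

def solve(exogenous, equations):
    """Compute the unique solution (a 'possible world') of an acyclic model."""
    vals = dict(exogenous)
    for var, f in equations:
        vals[var] = f(vals)
    return vals

world = solve(exogenous, equations)
print(world)  # {'U1': 1.0, 'U2': 2.0, 'V1': 3.0, 'V2': 6.0}
```

Acyclicity is what guarantees this single forward pass suffices; with cyclic equations there may be zero or many solutions.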
  18.
    To analyse contingent propositions, this paper investigates how branching time structures can be combined with probability theory. In particular, it considers assigning infinitesimal probabilities—available in non-Archimedean probability theory—to individual histories. This allows us to introduce the concept of ‘remote possibility’ as a new modal notion between ‘impossibility’ and ‘appreciable possibility’. The proposal is illustrated by applying it to a future contingent and a historical counterfactual concerning an infinite sequence of coin tosses. The latter is a toy model that is used to illustrate the applicability of the proposal to more realistic physical models.
    Found 1 month ago on PhilSci Archive
  19.
    A wide variety of stochastic models of cladogenesis (based on speciation and extinction) lead to an identical distribution on phylogenetic tree shapes once the edge lengths are ignored. By contrast, the distribution of the tree’s edge lengths is generally quite sensitive to the underlying model. In this paper, we review the impact of different model choices on tree shape and edge length distribution, and their implications for studying the properties of phylogenetic diversity (PD) as a measure of biodiversity, and the loss of PD as species become extinct at the present. We also compare PD with a stochastic model of feature diversity, and investigate some mathematical links and inequalities between these two measures plus their predictions concerning the loss of biodiversity under extinction at the present.
    Found 1 month ago on Mike Steel's site
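As a rough illustration of the PD measure discussed here (toy tree and edge lengths are mine): the phylogenetic diversity of a set of surviving taxa is the total edge length of the subtree connecting them, computed below in its rooted form as the union of the root-to-taxon paths.

```python
# Toy rooted tree:  root --2--> x --1--> A,  x --1--> B,  root --3--> C.
# Each node maps to its parent, and to the length of the edge above it.
parent = {"A": "x", "B": "x", "x": "root", "C": "root"}
length = {"A": 1.0, "B": 1.0, "x": 2.0, "C": 3.0}

def pd(taxa):
    """Rooted phylogenetic diversity: total length of edges on the
    union of paths from each taxon up to the root."""
    edges = set()
    for node in taxa:
        while node != "root":
            edges.add(node)        # an edge is named by its child node
            node = parent[node]
    return sum(length[e] for e in edges)

print(pd({"A", "B"}))       # 1 + 1 + 2 = 4.0
print(pd({"A", "B", "C"}))  # 4 + 3 = 7.0; losing C to extinction drops PD by 3
```

The shared edge above A and B is counted once — that sharing is exactly why PD loss under extinction depends so strongly on the edge length distribution.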
  20.
    We give a new coalgebraic semantics for intuitionistic modal logic with □. In particular, we provide a coalgebraic representation of intuitionistic descriptive modal frames and of intuitionistic modal Kripke frames based on image-finite posets. This gives a solution to an implicit problem in the area of coalgebraic logic for these classes of frames, raised explicitly by Litak (2014) and de Groot and Pattinson (2020). Our key technical tool is a recent generalization of a construction by Ghilardi, in the form of a right adjoint to the inclusion of the category of Esakia spaces in the category of Priestley spaces. As an application of these results, we study bisimulations of intuitionistic modal frames, describe dual spaces of free modal Heyting algebras, and provide a path towards a theory of coalgebraic intuitionistic logics.
    Found 1 month, 1 week ago on Nick Bezhanishvili's site
  21.
    The Goldblatt-Thomason theorem is a classic result of modal definability of Kripke frames. Its topological analogue for the closure semantics has been proved by ten Cate et al. (2009). In this paper we prove a version of the Goldblatt-Thomason theorem for topological semantics via the Cantor derivative. We work with derivative spaces which provide a natural generalisation of topological spaces on the one hand and of weakly transitive frames on the other.
    Found 1 month, 1 week ago on Nick Bezhanishvili's site
  22.
    Polyhedral semantics is a recently introduced branch of spatial modal logic, in which modal formulas are interpreted as piecewise linear subsets of a Euclidean space. Polyhedral semantics for the basic modal language has already been well investigated. However, for many practical applications of polyhedral semantics, it is advantageous to enrich the basic modal language with a reachability modality. Recently, a language with an Until-like spatial modality has been introduced, with demonstrated applicability to the analysis of 3D meshes via model checking. In this paper, we exhibit an axiom system for this logic, and show that it is complete with respect to polyhedral semantics. The proof consists of two major steps: First, we show that this logic, which is built over Grzegorczyk’s system Grz, has the finite model property. Subsequently, we show that every formula satisfied in a finite poset is also satisfied in a polyhedral model, thereby establishing polyhedral completeness.
    Found 1 month, 1 week ago on Nick Bezhanishvili's site
  23.
    Natural language does not express all connectives definable in classical logic as simple lexical items. Coordination in English is expressed by conjunction and, disjunction or, and negated disjunction nor. Other languages pattern similarly. Non-lexicalized connectives are typically expressed compositionally: in English, negated conjunction is typically expressed by combining negation and conjunction (not both). This is surprising: if ∧ and ∨ are duals, and the negation of the latter can be expressed lexically (nor), why not the negation of the former? I present a two-tiered model of the semantics of the binary connectives. The first tier captures the expressive power of the lexicon: it is a bilateral state-based semantics that, under a restriction, can express all and only the distinctions that can be expressed by the lexicon of natural language (and, or, nor). This first tier is characterized by rejection as non-assertion and a Neglect Zero assumption. The second tier is obtained by dropping the Neglect Zero assumption and enforcing a stronger notion of rejection, thereby recovering classical logic and thus definitions for all Boolean connectives. On the two-tiered model, we distinguish the limited expressive resources of the lexicon and the greater combinatorial expressive power of the language as a whole. This gives us a logic-based account of compositionality for the Boolean fragment of the language.
    Found 1 month, 1 week ago on Rush T. Stewart's site
  24.
    In this article, I try to shed new light on Frege’s envisaged definitional introduction of real and complex numbers in Die Grundlagen der Arithmetik (1884) and the status of cross-sortal identity claims with side glances at Grundgesetze der Arithmetik (vol. I 1893, vol. II 1903). As far as I can see, this topic has not yet been discussed in the context of Grundlagen. I show why Frege’s strategy in the case of the projected definitions of real and complex numbers in Grundlagen is modelled on his definitional introduction of cardinal numbers in two steps, tentatively via a contextual definition and finally and definitively via an explicit definition. I argue that the strategy leaves a few important questions open, in particular one relating to the status of the envisioned abstraction principles for the real and complex numbers and another concerning the proper handling of cross-sortal identity claims.
    Found 1 month, 1 week ago on Rush T. Stewart's site
  25.
    In this paper we use proof-theoretic methods, specifically sequent calculi, admissibility of cut within them and the resultant subformula property, to examine a range of philosophically-motivated deontic logics. We show that for all of those logics it is a (meta)theorem that the Special Hume Thesis holds, namely that no purely normative conclusion follows non-trivially from purely descriptive premises (nor vice versa). In addition to its interest on its own, this also illustrates one way in which proof theory sheds light on philosophically substantial questions.
    Found 1 month, 1 week ago on Rush T. Stewart's site
  26.
    We’ve been hard at work here in Edinburgh. Kris Brown has created Julia code to implement the ‘stochastic C-set rewriting systems’ I described last time. I want to start explaining this code and also examples of how we use it. …
    Found 1 month, 2 weeks ago on Azimuth
  27.
    Judgment-aggregation theory has always focused on the attainment of rational collective judgments. But so far, rationality has been understood in static terms: as coherence of judgments at a given time, defined as consistency, completeness, and/or deductive closure. This paper asks whether collective judgments can be dynamically rational, so that they change rationally in response to new information. Formally, a judgment aggregation rule is dynamically rational with respect to a given revision operator if, whenever all individuals revise their judgments in light of some information (a learnt proposition), then the new aggregate judgments are the old ones revised in light of this information, i.e., aggregation and revision commute. We prove an impossibility theorem: if the propositions on the agenda are non-trivially connected, no judgment aggregation rule with standard properties is dynamically rational with respect to any revision operator satisfying some basic conditions. Our theorem is the dynamic-rationality counterpart of some well-known impossibility theorems for static rationality. We also explore how dynamic rationality might be achieved by relaxing some of the conditions on the aggregation rule and/or the revision operator. Notably, premise-based aggregation rules are dynamically rational with respect to so-called premise-based revision operators.
    Found 1 month, 2 weeks ago on PhilSci Archive
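The fragility of judgment aggregation that motivates this line of work can be seen in the classic discursive dilemma (a standard toy example of static irrationality, not taken from the paper): propositionwise majority voting can produce an inconsistent collective judgment set from individually consistent judges, whereas a premise-based rule stays consistent by construction.

```python
# Three consistent judges vote on premises p, q and the conclusion r = p AND q.
judges = [
    {"p": True,  "q": True,  "r": True},
    {"p": True,  "q": False, "r": False},
    {"p": False, "q": True,  "r": False},
]

def majority(prop):
    """True iff a strict majority of judges accepts the proposition."""
    return sum(j[prop] for j in judges) > len(judges) / 2

collective = {x: majority(x) for x in ("p", "q", "r")}
print(collective)  # {'p': True, 'q': True, 'r': False} -- inconsistent with r = p AND q

# A premise-based rule aggregates only p and q and derives r logically:
premise_based_r = majority("p") and majority("q")
print(premise_based_r)  # True -- consistent
```

The paper's impossibility result is the dynamic analogue: on non-trivially connected agendas, no standard aggregation rule commutes with revision, and premise-based rules again emerge as the escape route.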
  28.
    In quantum mechanics, we appeal to decoherence as a process that explains the emergence of a quasi-classical order. Decoherence has no classical counterpart. Moreover, it is an apparently irreversible process [1–7]. In this paper, we investigate the nature and origin of its irreversibility. Decoherence and quantum entanglement are two physical phenomena that tend to go together. The former relies on the latter, but the reverse is not true. One can imagine a simple bipartite system in which two microscopic subsystems are initially unentangled and become entangled at the end of the interaction. Decoherence does not occur, since neither system is macroscopic. Nevertheless, we will still need to quantify entanglement in order to describe the arrow of time associated with decoherence, because it occurs when microscopic systems become increasingly entangled with the degrees of freedom in their macroscopic environments. To do this we need to define entanglement entropy in terms of the sum of the von Neumann entropies of the subsystems.
    Found 1 month, 2 weeks ago on PhilSci Archive
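The last step — quantifying entanglement via the von Neumann entropies of the subsystems — can be checked on the simplest example, a two-qubit Bell state (a standard textbook computation; the code assumes NumPy):

```python
# Entanglement entropy of the Bell state (|00> + |11>)/sqrt(2): for a pure
# bipartite state, each subsystem's von Neumann entropy S = -tr(rho log rho)
# is maximal, here 1 bit for a qubit.
import numpy as np

psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho = np.outer(psi, psi.conj())              # full 4x4 density matrix

# Partial trace over the second qubit: view rho with indices (a, b, a', b')
# and sum over b = b'.
rho_a = np.trace(rho.reshape(2, 2, 2, 2), axis1=1, axis2=3)

eigvals = np.linalg.eigvalsh(rho_a)          # [0.5, 0.5] -> maximally mixed
entropy = -sum(l * np.log2(l) for l in eigvals if l > 1e-12)
print(entropy)  # 1.0 bit
```

A product state run through the same code gives entropy 0, which is the sense in which the entropy tracks how entangled a subsystem is with its environment.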
  29.
    Duality in the Exact Sciences: The Application to Quantum Mechanics.
    Found 1 month, 2 weeks ago on PhilSci Archive
  30.
    In this brief note I will try to develop the following thesis: Gödel’s program includes a rich and exciting task for the philosopher that has been overlooked by the majority of the philosophers of set theory (let alone set theorists). Gödel’s program intends, in a nutshell, to solve Cantor’s Continuum Hypothesis (hereafter, CH) as a legitimate problem by means of the addition of new axioms to ZFC that satisfy some criteria of naturalness and that, moreover, allow one to derive either CH or its negation. Hence, the view encapsulated by such a program clashes violently with other attitudes towards the status of CH, like those defending that CH is a problem but is solved by the independence phenomenon itself, those that argue that CH is a vague statement and therefore is ill-posed as a problem, and, finally, those that regard the axiom-adding proposals as incapable of settling the question.
    Found 1 month, 2 weeks ago on PhilSci Archive