1. 107084.029843
    The original architects of the representational theory of measurement interpreted their formalism operationally and explicitly acknowledged that some aspects of their representations are conventional. We argue that the conventional elements of the representations afforded by the theory require careful scrutiny as one moves toward a more metaphysically robust interpretation. We show this by establishing that there is a sense in which the very number system one uses to represent a physical quantity such as mass or length is conventional. This result undermines inferences that impute structure from the numerical representation to the quantity it is used to represent.
    Found 1 day, 5 hours ago on Michael E. Miller's site
  2. 156227.0301
    Last century, Michael Dummett argued that the principles of intuitionistic logic are semantically neutral, and that classical logic involves a distinctive commitment to realism. The ensuing debate over realism and anti-realism and intuitionistic logic has now receded from view. The situation is reversed in mathematics: constructive reasoning has become more popular in the 21st century with the rise of proof assistants based on constructive type theory. In this paper, I revisit Dummett’s concerns in the light of these developments, arguing that both constructive and classical reasoning are recognisable and coherent assertoric and inferential practices.
    Found 1 day, 19 hours ago on Greg Restall's site
  3. 378217.030136
    We investigate whether ordinary quantification over objects is an extensional phenomenon, or rather creates non-extensional contexts; each claim having been propounded by prominent philosophers. It turns out that the question only makes sense relative to a background theory of syntax and semantics (here called a grammar) that goes well beyond the inductive definition of formulas and the recursive definition of satisfaction. Two schemas for building quantificational grammars are developed, one that invariably constructs extensional grammars (in which quantification, in particular, thus behaves extensionally) and another that only generates non-extensional grammars (and in which quantification is responsible for the failure of extensionality). We then ask whether there are reasons to favor one of these grammar schemas over the other, and examine an argument according to which the proper formalization of deictic utterances requires adoption of non-extensional grammars.
    Found 4 days, 9 hours ago on Kai F. Wehmeier's site
  4. 378247.030157
    Given any set E of expressions freely generated from a set of atoms by syntactic operations, there exist trivially compositional functions on E (to wit, the injective and the constant functions), but also plenty of non-trivially compositional functions. Here we show that within the space of non-injective functions (and so a fortiori within the space of non-injective and non-constant functions), compositional functions are not sufficiently abundant to generate the consequence relation of every propositional logic. Logical consequence relations thus impose substantive constraints on the existence of compositional functions when coupled with the condition of non-injectivity (though not without it). We ask how the a priori exclusion of injective functions from the search space might be justified, and we discuss the prospects of claims to the effect that any function can be “encoded” in a compositional one.
    Found 4 days, 9 hours ago on Kai F. Wehmeier's site
  5. 396528.030173
    The 2021 Nobel Prize in Economics recognized a theory of causal inference that warrants more attention from philosophers. To this end, I design a tutorial on that theory for philosophers and develop a dialectic that connects to a traditional debate in philosophy: the Lewis-Stalnaker debate on Conditional Excluded Middle (CEM). I first defend CEM, presenting a new Quine-Putnam indispensability argument based on the Nobel-winning application of the Rubin causal model (the potential outcome framework). Then, I switch sides to challenge this argument, introducing an updated version of the Rubin causal model that preserves the successful application while dispensing with CEM.
    Found 4 days, 14 hours ago on PhilSci Archive
  6. 396583.03019
    This paper explores the artificial intelligence (AI) containment problem, specifically addressing the challenge of creating effective safeguards for artificial general intelligence (AGI) and superintelligence. I argue that complete control—defined as full predictability of AI actions and total adherence to safety requirements—is unattainable. The paper reviews five key constraints: incompleteness, indeterminacy, unverifiability, incomputability, and incorrigibility. These limitations are grounded in logical, philosophical, mathematical, and computational theories, such as Gödel’s incompleteness theorem and the halting problem, which collectively prove the impossibility of AI containment. I argue that instead of pursuing complete AI containment, resources should be allocated to risk management strategies that acknowledge AI’s unpredictability and prioritize adaptive oversight mechanisms.
    Found 4 days, 14 hours ago on PhilSci Archive
  7. 620689.030206
    Today I’d like to dig a little deeper into some ideas from Part 2. I’ve been talking about causal loop diagrams. Very roughly speaking, a causal loop diagram is a graph with labeled edges. I showed how to ‘pull back’ and ‘push forward’ these labels along maps of graphs. …
    Found 1 week ago on Azimuth
  8. 664654.030221
    Bisimulations are standard in modal logic and, more generally, in the theory of state-transition systems. The quotient structure of a Kripke model with respect to the bisimulation relation is called a bisimulation contraction. The bisimulation contraction is a minimal model bisimilar to the original model, and hence, for (image-)finite models, a minimal model modally equivalent to the original. Similar definitions exist for bounded bisimulations (k-bisimulations) and bounded bisimulation contractions. Two finite models are k-bisimilar if and only if they are modally equivalent up to modal depth k. However, the quotient structure with respect to the k-bisimulation relation does not guarantee a minimal model preserving modal equivalence to depth k. In this paper, we remedy this asymmetry with respect to standard bisimulations and provide a novel definition of bounded contractions called rooted k-contractions. We prove that rooted k-contractions preserve k-bisimilarity and are minimal with this property. Finally, we show that rooted k-contractions can be exponentially more succinct than standard k-contractions.
    Found 1 week ago on Thomas Bolander's site
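    The quotient construction the abstract builds on (the standard, unbounded bisimulation contraction) can be sketched by naive partition refinement. The dict-based Kripke-model encoding and all names below are my own assumptions, not the paper's:

```python
# A naive sketch of the bisimulation contraction of a finite Kripke model by
# partition refinement. val: state -> set of atomic propositions;
# R: state -> set of successor states. (Encoding and names are mine.)

def bisimulation_contraction(val, R):
    # Split any block whose members differ in valuation or can reach
    # different sets of blocks, until the partition is stable.
    def signature(s, blocks):
        block_of = {t: i for i, b in enumerate(blocks) for t in b}
        return (frozenset(val[s]), frozenset(block_of[t] for t in R[s]))

    blocks = [set(val)]  # start with one block containing every state
    while True:
        new = []
        for b in blocks:
            sigs = {}
            for s in b:
                sigs.setdefault(signature(s, blocks), set()).add(s)
            new.extend(sigs.values())
        if len(new) == len(blocks):  # no block split: partition is stable
            return new
        blocks = new

# Two bisimilar states collapse into one block of the contraction:
val = {'a': {'p'}, 'b': {'p'}, 'c': set()}
R = {'a': {'c'}, 'b': {'c'}, 'c': set()}
print(bisimulation_contraction(val, R))  # 'a' and 'b' land in the same block
```

Each block of the resulting partition is one state of the contracted model; the paper's rooted k-contractions refine this idea for bounded bisimilarity.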
  9. 739978.030235
    Truthmaker semantics is a non-classical logical framework that has recently garnered significant interest in philosophy, logic, and natural language semantics. It redefines the propositional connectives and gives rise to more fine-grained entailment relations than classical logic. In its model theory, truth is not determined with respect to possible worlds, but with respect to truthmakers, such as states or events. Unlike possible worlds, these truthmakers may be partial; they may be either coherent or incoherent; and they are understood to be exactly or wholly relevant to the truth of the sentences they verify. Truthmaker semantics generalizes collective, fusion-based theories of conjunction; alternative-based theories of disjunction; and nonstandard negation semantics. This article provides a gentle introduction to truthmaker semantics aimed at linguists; describes applications to various natural language phenomena such as imperatives, ignorance implicatures, and negative events; and discusses its similarities and differences to related frameworks such as event semantics, situation semantics, alternative semantics, and inquisitive semantics.
    Found 1 week, 1 day ago on Lucas Champollion's site
  10. 760385.03025
    In Part 2, I explained some stuff you can do with graphs whose edges are labeled by elements of a rig. Remember, a rig is like a ring, but it might not have negatives. A great example is the boolean rig, whose elements are truth values: The addition in this rig is ‘or’ and the multiplication is ‘and’. …
    Found 1 week, 1 day ago on Azimuth
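    The boolean rig mentioned in the post is easy to exhibit concretely; here is a minimal Python sketch (the function names are mine):

```python
# The boolean rig: addition is 'or' (identity False), multiplication is 'and'
# (identity True). A rig is like a ring without negatives: no element added
# to True gives False.

def badd(x, y):
    """Rig addition: logical 'or'."""
    return x or y

def bmul(x, y):
    """Rig multiplication: logical 'and'."""
    return x and y

vals = [False, True]

# Multiplication distributes over addition, as a rig requires:
assert all(bmul(a, badd(b, c)) == badd(bmul(a, b), bmul(a, c))
           for a in vals for b in vals for c in vals)

# But there are no additive inverses: True + x is always True.
assert all(badd(True, x) for x in vals)
```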
  11. 807792.030264
    In the previous post, I showed that Goodman and Quine’s counting method fails for objects that have too much overlap. I think (though the technical parts here are more difficult) that the same is true for their definition of the ancestral or transitive closure of a relation. …
    Found 1 week, 2 days ago on Alexander Pruss's Blog
  12. 1147030.030281
    Generative artificial intelligence (AI) applications based on large language models have not enjoyed much success in symbolic processing and reasoning tasks, thus making them of little use in mathematical research. However, DeepMind’s AlphaProof and AlphaGeometry 2 applications have recently been reported to perform well in mathematical problem solving. These applications are hybrid systems combining large language models with rule-based systems, an approach sometimes called neuro-symbolic AI. In this paper, I present a scenario in which such systems are used in research mathematics, more precisely in theorem proving. In the most extreme case, such a system could be an autonomous automated theorem prover (AATP), with the potential of proving new humanly interesting theorems and even presenting them in research papers. The use of such AI applications would be transformative to mathematical practice and demand clear ethical guidelines. In addition to that scenario, I identify other, less radical, uses of generative AI in mathematical research. I analyse how guidelines set for ethical AI use in scientific research can be applied in the case of mathematics, arguing that while there are many similarities, there is also a need for mathematics-specific guidelines.
    Found 1 week, 6 days ago on PhilSci Archive
  13. 1204742.0303
    The universal conception of necessity says that necessary truth is truth in all possible worlds. This idea is well studied in the context of classical possible worlds models, and there its logic is S5. The universal conception of necessity is less well studied in models for non-classical logics. We will present some preliminary results on universal necessity on models for intuitionistic logic, first-degree entailment, and relevant logics. We will close by discussing a way in which universal necessity is a very classical concept.
    Found 1 week, 6 days ago on Shawn Standefer's site
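    The universal conception in the classical setting is simple to state operationally; here is a toy sketch (the set-based encoding is my own), in which a necessitated proposition holds either at every world or at none:

```python
# Universal necessity on a classical model: 'Box p' is true at a world iff
# p holds at ALL worlds. Propositions are modeled as sets of worlds (mine).

def box(prop, worlds):
    """Return the set of worlds where 'Box prop' holds."""
    return set(worlds) if all(w in prop for w in worlds) else set()

worlds = {1, 2, 3}
p = {1, 2, 3}   # p holds everywhere
q = {1, 2}      # q fails at world 3

assert box(p, worlds) == worlds   # necessarily p: true at every world
assert box(q, worlds) == set()    # necessarily q: true at no world
# 'Box prop' never depends on the world of evaluation -- the all-or-nothing
# behaviour characteristic of the S5 logic of universal necessity.
```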
  14. 1204770.030326
    A challenge for relevant logicians is to delimit their area of study. I propose and explore the definition of a relevant logic as a logic satisfying a variable-sharing property and closed under detachment and adjunction. This definition is, I argue, a good definition that captures many familiar logics and raises interesting new questions concerning relevant logics. As is familiar to readers of Entailment or Relevant Logics and Their Rivals, the motivations for relevant logics have a strong intuitive pull. The philosophical picture put forward by Anderson and Belnap (1975), for example, is compelling and has led to many fruitful developments. With some practice, one can develop a feel for what sorts of axioms or rules lead to violations of relevance in standard relevant logics. These sorts of intuitions only go so far, as some principles that lead to violations of relevance in stronger logics are compatible with it in weaker logics. There is a large number of relevant logics, but there is not much discussion of precise characterizations of the class of relevant logics.
    Found 1 week, 6 days ago on Shawn Standefer's site
  15. 1320298.030341
    The idea that life is to be understood in terms of information has strongly taken hold in recent decades. I discuss two attempts to carry this through mathematically. G. J. Chaitin, co-founder of algorithmic information theory, proposes an information-theoretic definition of life in terms of organized complexity (Chaitin 1990a and 1990b). More recently, William Dembski, Winston Ewart, and Robert Marks have attempted to formulate in information-theoretic terms Dembski’s concept of specified complexity, using a mathematically hybrid entity they term “algorithmic specified complexity” (Ewart, Dembski, and Marks 2013a, 2014, 2015a, 2015b), and Dembski and Ewart have reformulated this concept in their newly revised edition (Dembski and Ewart 2023) of Dembski’s The Design Inference (Dembski 1998). The aim in both cases is to mathematically distinguish informational properties of biological complexity, in contrast to simple order, on the one hand, and mere randomness on the other. Moreover, the respective mathematical strategies are the same: To take an informational measure and subtract out its randomness, leaving a remainder of organization (Chaitin) or specified complexity (Dembski et al.).
    Found 2 weeks, 1 day ago on PhilSci Archive
  16. 1362933.030364
    Linsky & Zalta (1994) argued that simplest quantified modal logic (SQML), with its fixed domain, can be given an actualist interpretation if the Barcan formula is interpreted to conditionally assert the existence of contingently nonconcrete objects. But SQML itself doesn’t require the existence of such objects; in interpretations of SQML in which there is only one possible world, there are no contingent objects, nonconcrete or otherwise. I defend an axiom for SQML that will provably (a) force the domain to have the relevant objects and thereby (b) force the existence of more than one possible world, thereby forestalling modal collapse. I show that the new axiom can be justified by describing the theorems that can be proved when it is added to SQML. I further justify the axiom by reviewing the theorems the axiom allows us to prove when we assume object theory (‘OT’), in its latest incarnation, as a background framework. Finally, I consider the conclusions one can draw when we consider the new axiom in connection with actualism, as this view has been (re-)characterized in recent work.
    Found 2 weeks, 1 day ago on Ed Zalta's site
  17. 1390580.030395
    I’m talking about ‘causal loop diagrams’, which are graphs with edges labeled by ‘polarities’. Often the polarities are simply + and − signs, like here: But polarities can be elements of any monoid, and last time I argued that things work even better if they’re elements of a rig, so you can not only multiply them but also add them. …
    Found 2 weeks, 2 days ago on Azimuth
  18. 1488385.030417
    In Part 1 I explained ‘causal loop diagrams’, which are graphs with edges labeled by polarities. These are a way to express qualitatively, rather than quantitatively, how entities affect one another. For example, here’s how causal loop diagrams let us say that alcoholism ‘tends to increase’ domestic violence: We don’t need to specify any numbers, or even need to say what we mean by ‘tends to increase’, though that leads to the danger of using the term in a very loose way. …
    Found 2 weeks, 3 days ago on Azimuth
  19. 1560654.030439
    This is a progress report on some joint work with Xiaoyan Li, Nathaniel Osgood and Evan Patterson. Together with collaborators we have been developing software for ‘system dynamics’ modelling, and applying it to epidemiology—though it has many other uses. …
    Found 2 weeks, 4 days ago on Azimuth
  20. 1921473.030461
    I have elsewhere shown the consistency of the theory commonly called New Foundations or NF, originally proposed by W. v. O. Quine in his paper “New foundations for mathematical logic”. In this note, I review that original paper and may eventually review some other sources one might consult for information about this theory. Quine himself made some errors in this paper and later in his discussion of NF, and there are other characteristic difficulties that people have with this system which such a review might allow us to discuss.
    Found 3 weeks, 1 day ago on M. Randall Holmes's site
  21. 1928579.030487
    Thurston’s paper Shapes of polyhedra and triangulations of the sphere is really remarkable. I’m writing about it in my next column for the Notices of the American Mathematical Society. Here’s a draft — which is also a much more detailed version of an earlier blog post here. …
    Found 3 weeks, 1 day ago on Azimuth
  22. 2399482.03051
    A number of rules for resolving majority cycles in elections have been proposed in the literature. Recently, Holliday and Pacuit (J Theor Polit 33:475–524, 2021) axiomatically characterized the class of rules refined by one such cycle-resolving rule, dubbed Split Cycle: in each majority cycle, discard the majority preferences with the smallest majority margin. They showed that any rule satisfying five standard axioms plus a weakening of Arrow’s Independence of Irrelevant Alternatives (IIA), called Coherent IIA, is refined by Split Cycle. In this paper, we go further and show that Split Cycle is the only rule satisfying the axioms of Holliday and Pacuit together with two additional axioms, which characterize the class of rules that refine Split Cycle: Coherent Defeat and Positive Involvement in Defeat. Coherent Defeat states that any majority preference not occurring in a cycle is retained, while Positive Involvement in Defeat is closely related to the well-known axiom of Positive Involvement (as in J Pérez Soc Choice Welf 18:601–616, 2001). We characterize Split Cycle not only as a collective choice rule but also as a social choice correspondence, over both profiles of linear ballots and profiles of ballots allowing ties.
    Found 3 weeks, 6 days ago on Eric Pacuit's site
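    The rule quoted above ("in each majority cycle, discard the majority preferences with the smallest majority margin") can be brute-forced via an equivalent reachability test: a defeats b iff margin(a, b) > 0 and there is no path from b back to a all of whose margins are at least margin(a, b). The ballot encoding and the use of this reformulation here are my own sketch, not taken from the paper:

```python
# A brute-force sketch of the Split Cycle defeat relation (encoding mine).
# Ballots are tuples listing candidates from most to least preferred.

def margin(profile, a, b):
    """Net number of voters ranking a above b."""
    return sum(1 if r.index(a) < r.index(b) else -1 for r in profile)

def reachable(cands, m, src, dst, floor):
    """Is there a path src -> dst using only edges with margin >= floor?"""
    seen, stack = set(), [src]
    while stack:
        x = stack.pop()
        if x == dst:
            return True
        if x in seen:
            continue
        seen.add(x)
        stack.extend(y for y in cands if y != x and m(x, y) >= floor)
    return False

def split_cycle_defeats(cands, profile):
    m = lambda a, b: margin(profile, a, b)
    # a -> b survives iff it is a majority preference and no cycle through it
    # makes it the smallest-margin edge (i.e., no b -> a path at its margin).
    return {(a, b) for a in cands for b in cands if a != b
            and m(a, b) > 0 and not reachable(cands, m, b, a, m(a, b))}

# A majority cycle a > b > c > a with margins 3, 3, 1: the weakest edge
# (c over a, margin 1) is discarded, the other two defeats remain.
profile = [('a', 'b', 'c')] * 3 + [('b', 'c', 'a')] * 2 + [('c', 'a', 'b')] * 2
print(split_cycle_defeats({'a', 'b', 'c'}, profile))
```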
  23. 2416681.030532
    In this article, the suppositional account and different approaches to relevance conditionals are analysed on a specific type of conditional: Conditionals whose antecedent and consequent have a relevance connection, but where the acceptability of the antecedent has no influence on the acceptability of the consequent. Such conditionals occur in cases of multiple implication of a consequent, as in overdetermination. When evaluating such conditionals, the approaches examined lead to different and partly incoherent results. It is argued that approaches to conditionals should consider such conditionals acceptable, which is a challenge for, e.g., approaches based on statistical measures. Furthermore, it is argued that the probability of a conditional should be evaluated only according to the strength of the relevance connection between the antecedent and the consequent, but not according to other relevance connections. It is shown that only two approaches correctly evaluate such conditionals, one of which, inferentialism, may provide a basis for a coherent theory of conditionals.
    Found 3 weeks, 6 days ago on PhilSci Archive
  24. 2816775.030563
    I claim that there is no general, straightforward and satisfactory way to define a total comparative probability with the standard axioms using full conditional probabilities. By a “straightforward” way, I mean something like: - A ≲ B iff P(A−B|AΔB) ≤ P(B−A|AΔB) (De Finetti) or: - A ≲ B iff P(A|A∪B) ≤ P(B|A∪B) (Pruss). …
    Found 1 month ago on Alexander Pruss's Blog
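    De Finetti's recipe is easy to exercise in the unproblematic finite case, where every conditional probability is a ratio of counts under a strictly positive uniform measure. The post's worry concerns general full conditional probabilities (e.g. conditioning on measure-zero events), which this sketch deliberately does not model:

```python
from fractions import Fraction

# De Finetti's comparison on a finite sample space with the uniform measure,
# where P(X | Y) is simply |X & Y| / |Y|. (Encoding and names are mine.)

def cond(x, y):
    """P(x | y) for the uniform measure; y is assumed nonempty."""
    return Fraction(len(x & y), len(y))

def definetti_leq(a, b):
    """A <= B  iff  P(A - B | A delta B) <= P(B - A | A delta B)."""
    sym = a ^ b            # symmetric difference A delta B
    if not sym:            # A == B: the comparison holds trivially
        return True
    return cond(a - b, sym) <= cond(b - a, sym)

A = {1, 2}
B = {2, 3, 4}
assert definetti_leq(A, B) and not definetti_leq(B, A)
# Under the uniform measure this agrees with comparing |A| and |B|.
```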
  26. 3166843.03061
    We attempt to reconstruct Hans Reichenbach’s arguments for a macroscopic causal definition of the direction of time. Our analysis reveals that Reichenbach’s formulation of “screening off” is equivocal between the now common notion of conditional independence of two variables given others and a weaker notion that requires the conditional independence only for specific values of the variables. We also find that on the now common notion of screening off, his own conditions for the “usual…conjunctive forks” are mathematically impossible for binary variables. Finally, we note that as a corollary to his familiar Principle of the Common Cause, Reichenbach’s argument embraces a No Fatalism principle that forbids explaining earlier probabilistic associations by values of later variables.
    Found 1 month ago on PhilSci Archive
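    The distinction the abstract draws can be made concrete: full conditional independence requires P(x, y | z) = P(x | z)·P(y | z) for all values, while the weaker reading requires it only at specific values of the variables. The toy joint distribution below is my own illustration, not from the paper:

```python
from fractions import Fraction

F = Fraction
# joint[(x, y, z)] = P(X=x, Y=y, Z=z) over three binary variables (mine).
joint = {}
for x in (0, 1):
    for y in (0, 1):
        joint[(x, y, 0)] = F(1, 2) * F(1, 4)   # given Z=0: X, Y independent
joint[(0, 0, 1)] = F(1, 4)                     # given Z=1: X, Y perfectly
joint[(1, 1, 1)] = F(1, 4)                     # correlated
joint[(0, 1, 1)] = F(0)
joint[(1, 0, 1)] = F(0)

def p(pred):
    """Probability of the event picked out by pred(x, y, z)."""
    return sum(v for k, v in joint.items() if pred(*k))

def screens_off_at(z):
    """Does P(x, y | Z=z) == P(x | Z=z) * P(y | Z=z) for all x, y?"""
    pz = p(lambda x, y, zz: zz == z)
    return all(p(lambda xx, yy, zz: (xx, yy, zz) == (x, y, z)) * pz ==
               p(lambda xx, yy, zz: (xx, zz) == (x, z)) *
               p(lambda xx, yy, zz: (yy, zz) == (y, z))
               for x in (0, 1) for y in (0, 1))

assert screens_off_at(0)        # screening off at the specific value Z=0 ...
assert not screens_off_at(1)    # ... but not at Z=1: the weaker notion holds
                                # at Z=0 while full independence fails.
```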
  27. 3208521.030636
    Works by Humberstone (1981, 2011), van Benthem (1981, 2016), Holliday (2014, forthcoming), and Ding & Holliday (2020) attempt to develop a semantics of modal logic in terms of “possibilities”, i.e., “less determinate entities than possible worlds” (Edgington 1985). These works take possibilities as semantically primitive entities, stipulate a number of semantic principles that govern these entities (namely, Ordering, Persistence, Refinement, Cofinality, Negation, and Conjunction), and then interpret a modal language via this semantic structure. In this paper, we define possibilities in object theory (OT), and derive, as theorems, the semantic principles stipulated in the works cited. We then raise a concern for the semantic investigation of possibilities without a modal operator, and show that no such concern arises for the metaphysics of possibilities as developed in OT.
    Found 1 month ago on Ed Zalta's site
  28. 3570935.030658
    In previous work the author has proposed a different approach to the problem of von Neumann measurement and wave function collapse. Here we apply it to the collapse of degenerate states. Our predictions differ from those of von Neumann and, separately, Lüders in significant ways. An experiment is suggested that might distinguish between the possibilities.
    Found 1 month, 1 week ago on PhilSci Archive
  29. 3859457.030679
    In this paper I will address three topics in the logic of conditionals. The first is the question whether the class of ‘reasonable’ probability functions must be closed under conditionalization. The second topic is the character of logical consequence when probabilities of conditionals come into play. The third is more specific: I want to present a challenge to the possible worlds approach in formal semantics, in favor of an algebraic approach. For this I will use as a case study Alan Hajek’s views on counterfactual conditionals, and their problems with infinity. Included in this will be reasons to expect algebras of propositions to be incomplete algebras. Throughout I will use as foil what is known variously as Stalnaker’s Thesis, or the Conditional Construal of Conditional Probability (CCCP). That is the thesis that the probability of a conditional A → B is the conditional probability of B given A, when defined. That the CCCP is tenable for a reasonable logic of conditionals I will presuppose in the body of the paper, but I will present its credentials in the Appendix. The CCCP is to be distinguished from the Extended Stalnaker’s Thesis, or Extended CCCP, that the conditional probability of A → B given C equals the conditional probability of B given A and C. That extended thesis has been demolished again and again, and will appear here only in a note, to be dismissed.
    Found 1 month, 1 week ago on PhilSci Archive
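    The thesis at the center of the paper, P(A → B) = P(B | A) when P(A) > 0, is easy to illustrate numerically; the world-weights below are my own toy example:

```python
from fractions import Fraction

# A numeric sketch of the CCCP / Stalnaker's Thesis: the probability of
# "if A then B" is set to P(B | A) = P(A and B) / P(A), when P(A) > 0.
# Worlds and rational weights are my own illustration.

weights = {'w1': Fraction(1, 2), 'w2': Fraction(1, 4), 'w3': Fraction(1, 4)}
A = {'w1', 'w2'}   # worlds where A holds
B = {'w1', 'w3'}   # worlds where B holds

def prob(event):
    return sum(weights[w] for w in event)

def cccp(a, b):
    """P(a -> b) under the CCCP: conditional probability of b given a."""
    return prob(a & b) / prob(a)

assert cccp(A, B) == Fraction(2, 3)   # P(A&B) = 1/2, P(A) = 3/4
```

The Extended CCCP the paper dismisses would further require P(A → B | C) = P(B | A ∩ C) for every C, a much stronger and, as the paper notes, repeatedly refuted demand.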
  30. 4090388.030704
    Most believe that there are no empirical grounds that make the adoption of quantum logic necessary. Ian Rumfitt has further argued that this adoption is not possible, either, for the proof that distribution fails in quantum mechanics is rule-circular or unsound. I respond to Rumfitt, by showing that neither is the case: rule-circularity disappears when an appropriate semantics is considered, and soundness is restored by slightly modifying standard quantum mechanics. Thus, although this is indeed not necessary, it is nevertheless possible for a quantum logician to rationally adjudicate against classical logic.
    Found 1 month, 2 weeks ago on PhilSci Archive