1. 288319.725772
    A correspondent asked me how a simple God can choose. I've thought much about this, never quite happy with what I have to say. I am still not happy (nor is it surprising if "how God functions" is beyond us!) …
    Found 3 days, 8 hours ago on Alexander Pruss's Blog
  2. 521667.725833
    This paper investigates histories in Branching Space-Time (BST) structures. We start by identifying necessary and sufficient conditions for the existence of free histories; we then turn to the intangibility problem and show that the existence of histories in BST structures is equivalent to the axiom of choice, yielding the punchline “history gives us choice”.
    Found 6 days ago on PhilSci Archive
  3. 712084.725841
    In this paper, we propose a novel algorithm for epistemic planning based on dynamic epistemic logic (DEL). The novelty is that we limit the depth of reasoning of the planning agent to an upper bound b, meaning that the planning agent can only reason about higher-order knowledge to at most (modal) depth b. The algorithm makes use of a novel type of canonical b-bisimulation contraction guaranteeing unique minimal models with respect to b-bisimulation. We show our depth-bounded planning algorithm to be sound. Additionally, we show it to be complete with respect to planning tasks having a solution within bound b of reasoning depth (and hence the iterative bound-deepening variant is complete in the standard sense). For bound b of reasoning depth, the algorithm is shown to be (b + 1)-EXPTIME-complete, and furthermore fixed-parameter tractable in the number of agents and atoms. We present both a tree search and a graph search variant of the algorithm, and we benchmark an implementation of the tree search version against a baseline epistemic planner.
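    The "iterative bound-deepening variant" can be pictured as a simple wrapper around the depth-bounded planner: try b = 0, 1, 2, … until a plan is found. A minimal sketch in Python, assuming a hypothetical solve_with_bound(task, b) that returns a plan or None (not the paper's actual interface):

      def iterative_bound_deepening(task, solve_with_bound, max_bound=None):
          # Try reasoning-depth bounds b = 0, 1, 2, ...; complete whenever
          # the task has a solution within some finite bound b.
          b = 0
          while max_bound is None or b <= max_bound:
              plan = solve_with_bound(task, b)  # assumed: returns a plan or None
              if plan is not None:
                  return plan, b
              b += 1
          return None, None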
    Found 1 week, 1 day ago on Thomas Bolander's site
  4. 1152165.725847
    Consider a non-relativistic quantum particle with wave function ψ inside a region Ω ⊂ ℝ³, and suppose that detectors are placed along the boundary. The question how to compute the probability distribution of the time at which the detector surface registers the particle boils down to finding a reasonable mathematical definition of an ideal detecting surface; a particularly convincing definition, called the absorbing boundary rule, involves a time evolution for the particle’s wave function ψ expressed by a Schrödinger equation in Ω together with an “absorbing” boundary condition on ∂Ω first considered by Werner in 1987, viz., ∂ψ/∂n = iκψ with κ > 0 and ∂/∂n the normal derivative. We provide here a discussion of the rigorous mathematical foundation of this rule. First, for the viability of the rule it plays a crucial role that these two equations together uniquely define the time evolution of ψ; we point out here how, under some technical assumptions on the regularity (i.e., smoothness) of the detecting surface, the Lumer-Phillips theorem implies that the time evolution is well defined and given by a contraction semigroup. Second, we show that the collapse required for the N-particle version of the problem is well defined. We also prove that the joint distribution of the detection times and places, according to the absorbing boundary rule, is governed by a positive-operator-valued measure.
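    For orientation, the standard probability-current calculation (not quoted from the paper) shows why this boundary condition acts as a detector: with j = (ℏ/m) Im(ψ̄∇ψ),

      \[
        j \cdot n
          = \frac{\hbar}{m}\,\mathrm{Im}\!\Big(\bar\psi\,\frac{\partial\psi}{\partial n}\Big)
          = \frac{\hbar}{m}\,\mathrm{Im}\big(\bar\psi\, i\kappa\,\psi\big)
          = \frac{\hbar\kappa}{m}\,|\psi|^2 \;\ge\; 0,
      \]

    so the outward flux through the detecting surface is non-negative, and the absorbing boundary rule takes (ℏκ/m)|ψ_t(x)|² as the probability density for a detection at time t and surface point x.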
    Found 1 week, 6 days ago on R. Tumulka's site
  5. 2082031.725857
    The early study Tennant [11] sought to show how the role played by formal semantics in furnishing models that would invalidate unprovable first-order arguments from premise-sets to conclusions could be taken over by proofs and disproofs. (A disproof of a set of premises is a proof of ⊥, i.e., absurdity, from it.) For any given invalid first-order argument, these latter would be proofs and disproofs in Peano Arithmetic (PA), employing suitable substitutions of arithmetical predicates for the primitive predicates involved in the argument. PA-proofs would be furnished for the premises of the invalid argument, and a PA-disproof would be furnished for its conclusion. This was an early move towards a general proof-theoretic semantics. By a theorem of Hilbert and Bernays [4], these arithmetical predicates can be taken to be of arithmetical complexity no greater than Δ⁰₂.
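    A toy instance of the strategy (our example, not Tennant's): the invalid argument from F(0) to ∀x F(x) is exposed by the arithmetical substitution

      \[
        F(x) \;:=\; \exists y\,(x = y + y) \quad (\text{``}x\text{ is even''}),
      \]

    since PA proves the premise ∃y (0 = y + y) but refutes the conclusion ∀x ∃y (x = y + y) (PA proves that 1 is not even); so PA supplies a proof of the premise and a disproof of the conclusion, doing the work of a countermodel.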
    Found 3 weeks, 3 days ago on Neil Tennant's site
  6. 2938318.725862
    The family of relevant logics can be faceted by a hierarchy of increasingly fine-grained variable sharing properties—requiring that in valid entailments A → B, some atom must appear in both A and B with some additional condition (e.g., with the same sign or nested within the same number of conditionals). In this paper, we consider an incredibly strong variable sharing property of lericone relevance that takes into account the path of negations and conditionals in which an atom appears in the parse trees of the antecedent and consequent. We show that this property of lericone relevance holds of the relevant logic BM (and that a related property of faithful lericone relevance holds of B) and characterize the largest fragments of classical logic with these properties. Along the way, we consider the consequences of lericone relevance for the theory of subject-matter, for Logan’s notion of hyperformalism, and for the very definition of a relevant logic itself.
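    For orientation (our gloss, not the paper's statement), the weakest property in this hierarchy is Belnap's classical variable sharing requirement, which lericone relevance strengthens by constraining where and how the shared atom may occur:

      \[
        \text{If } A \to B \text{ is a theorem, then some propositional atom occurs in both } A \text{ and } B.
      \]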
    Found 1 month ago on Shawn Standefer's site
  7. 3042954.725867
    Suppose you are 40% confident that Candidate X will win in the upcoming election. Then you read a column projecting 80%. If you and the columnist are equally well informed and competent on this topic, how should you revise your opinion in light of theirs? Should you perhaps split the difference, arriving at 60%? Plenty has been written on this topic. Much less studied, however, is the question of what comes next. Once you’ve updated your opinion about Candidate X, how should your other opinions change to accommodate this new view? For example, how should you revise your expectations about other candidates running for other seats? Or your confidence that your preferred party will win a majority?
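    One natural model of "what comes next" (our illustration; the paper may defend a different rule) is Jeffrey conditionalization on the partition {X wins, X loses}: keep your conditional credences fixed and reweight them by your revised credence that X wins.

      def jeffrey_update(propositions, cond_given_win, cond_given_lose, new_p_win):
          # cond_given_win[a]  = old credence in a, given that X wins
          # cond_given_lose[a] = old credence in a, given that X loses
          # new_p_win          = revised credence that Candidate X wins (e.g. 0.6)
          return {
              a: cond_given_win[a] * new_p_win + cond_given_lose[a] * (1 - new_p_win)
              for a in propositions
          }

      # Example: credence that your preferred party wins a majority.
      new = jeffrey_update(["majority"], {"majority": 0.7}, {"majority": 0.3}, new_p_win=0.6)
      # new["majority"] == 0.7 * 0.6 + 0.3 * 0.4 == 0.54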
    Found 1 month ago on Philosopher's Imprint
  8. 3111480.725873
    We re-examine the old question to what extent mathematics may be compared with a game. Mainly inspired by Hilbert and Wittgenstein, our answer is that mathematics is something like a “rhododendron of language games”, where the rules are inferential. The pure side of mathematics is essentially formalist, where we propose that truth is not carried by theorems corresponding to whatever independent reality and arrived at through proof, but is defined by correctness of rule-following (and as such is objective given these rules). Gödel’s theorems, which are often seen as a threat to formalist philosophies of mathematics, actually strengthen our concept of truth. The applied side of mathematics arises from two practices: first, the dual nature of axiomatization as taking from heuristic practices like physics and informal mathematics whilst giving proofs and logical analysis; and second, the ability to use the inferential role of theorems to make “surrogative” inferences about natural phenomena. Our framework is pluralist, combining various (non-referential) philosophies of mathematics.
    Found 1 month ago on PhilSci Archive
  9. 3111642.725878
    This paper proposes an alternative to standard first-order logic that seeks greater naturalness, generality, and semantic self-containment. The system removes the first-order restriction, avoids type hierarchies, and dispenses with external structures, making the meaning of expressions depend solely on their constituent symbols. Terms and formulas are unified into a single notion of expression, with set-builder notation integrated as a primitive construct. Connectives and quantifiers are treated as operators among others rather than as privileged primitives. The deductive framework is minimal and intuitive, with soundness and consistency established and completeness examined. While computability requirements may limit universality, the system offers a unified and potentially more faithful model of human mathematical deduction, providing an alternative foundation for formal reasoning.
    Found 1 month ago on PhilSci Archive
  10. 3222483.725887
    Some important policies will change future mortality rates (like climate mitigation), change future fertility rates (like public education), or respond to the emerging challenges of global depopulation. Any such policy will change each of the quality of lives, the quantity of lives, and who will live in the future. Hence, to evaluate economic policies, we need to assess both social risk and variable population. A standard principle for economic policy evaluation is Expected Total Utilitarianism, which maximizes the expected value of the sum of individuals’ transformed lifetime well-being. Despite the prominent use in public economics of both additive utilitarianism and expectation-taking under risk, these methods remain questionable in welfare economics, in part because existing axiomatic justifications make strong assumptions (Fleurbaey, 2010; Golosov et al., 2007).
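    In symbols (our rendering of the definition just quoted), Expected Total Utilitarianism ranks policies by

      \[
        V(\text{policy}) \;=\; \mathbb{E}\Big[\,\sum_{i \in \text{population}} g(w_i)\Big],
      \]

    where w_i is individual i's lifetime well-being, g is the transformation, and both the population and the w_i vary across the states of the world over which the expectation is taken.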
    Found 1 month ago on Johan E. Gustafsson's site
  11. 3227093.725898
    The meta-inductive approach to induction justifies induction by proving its optimality. The argument for the optimality of induction proceeds in two steps. The first ‘a priori’ step intends to show that meta-induction is optimal and the second ‘a posteriori’ step intends to show that meta-induction selects object-induction in our world. I critically evaluate the second step and raise two problems: the identification problem and the indetermination problem. In light of these problems, I assess the prospects of any meta-inductive approach to induction.
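    As a simplified picture of what the ‘a priori’ optimality step concerns (our sketch; Schurz's official rule weights methods by "attractivities" rather than raw success, and the optimality theorems are proved for that version): meta-induction predicts a success-weighted average of the accessible prediction methods, object-induction among them.

      def meta_inductive_prediction(past_success, current_predictions):
          # past_success[m]: cumulative predictive success of method m so far (>= 0)
          # current_predictions[m]: method m's prediction for the next event, in [0, 1]
          total = sum(past_success.values())
          if total == 0:
              # no track record yet: fall back to a plain average
              return sum(current_predictions.values()) / len(current_predictions)
          return sum(past_success[m] * current_predictions[m]
                     for m in current_predictions) / total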
    Found 1 month ago on PhilSci Archive
  12. 3227115.725903
    While causal models are introduced very much like a formal logical system, they have not yet been taken to the level of a proper logic of causal reasoning with structural equations. In this paper, we furnish causal models with a distinct deductive system and a corresponding model-theoretic semantics. Interventionist conditionals will be defined in terms of inferential relations in this logic of causal models.
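    To fix ideas about what "structural equations" and "interventions" amount to here, a minimal sketch (ours, not the paper's formalism): a two-variable causal model and a do-intervention that overrides one of its equations. An interventionist conditional such as "had X been set to False, Y would have been False" is then assessed in the intervened model.

      def evaluate(equations, exogenous, intervention=None):
          # equations: dict from variable to a function of the current value-assignment,
          # listed in an order where each variable's parents come first.
          # intervention: dict fixing some variables, overriding their equations.
          values = dict(exogenous)
          for var, f in equations.items():
              if intervention and var in intervention:
                  values[var] = intervention[var]
              else:
                  values[var] = f(values)
          return values

      # X := U,  Y := X and U
      equations = {"X": lambda v: v["U"], "Y": lambda v: v["X"] and v["U"]}
      observed   = evaluate(equations, {"U": True})                  # Y is True
      intervened = evaluate(equations, {"U": True}, {"X": False})    # do(X := False): Y is False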
    Found 1 month ago on PhilSci Archive
  13. 3445738.725908
    In Part 9 we saw, loosely speaking, that the theory of a hydrogen atom is equivalent to the theory of a massless left-handed spin-½ particle in the Einstein universe—a static universe where space is a 3-sphere. …
    Found 1 month, 1 week ago on Azimuth
  14. 3840352.725914
    We introduce a challenge designed to evaluate the capability of Large Language Models (LLMs) in performing mathematical induction proofs, with a particular focus on nested induction. Our task requires models to construct direct induction proofs in both formal and informal settings, without relying on any preexisting lemmas. Experimental results indicate that current models struggle with generating direct induction proofs, suggesting that there remains significant room for improvement.
    Found 1 month, 1 week ago on Koji Mineshima's site
  15. 3848507.725919
    On its surface, a sentence like If Laura becomes a zombie, she wants you to shoot her looks like a plain conditional with the attitude want in its consequent. However, the most salient reading of this sentence is not about the desires of a hypothetical zombie-Laura. Rather, it asserts that the actual, non-zombie Laura has a certain restricted attitude: her present desires, when considering only possible states of affairs in which she becomes a zombie, are such that you shoot her. This can be contrasted with the shifted reading about zombie-desires that arises with conditional morphosyntax, e.g., If Laura became a zombie, she would want you to shoot her. Furthermore, as Blumberg and Holguín (J Semant 36(3):377–406, 2019) note, restricted attitude readings can also arise in disjunctive environments, as in Either a lot of people are on the deck outside, or I regret that I didn’t bring more friends. We provide a novel analysis of restricted and shifted readings in conditional and disjunctive environments, with a few crucial features. First, both restricted and shifted attitude conditionals are in fact “regular” conditionals with attitudes in their consequents, which accords with their surface-level appearance and contrasts with Pasternak’s (The mereology of attitudes, Ph.D. thesis, Stony Brook University, Stony Brook, NY, 2018) Kratzerian approach, in which the if-clause restricts the attitude directly. Second, whether the attitude is or is not shifted—i.e., zombie versus actual desires—is dependent on the presence or absence of conditional morphosyntax. And third, the restriction of the attitude is effected by means of aboutness, a concept for which we provide two potential …
    Found 1 month, 1 week ago on Kai von Fintel's site
  16. 3963562.725924
    Today I want to make a little digression into the quaternions. We won’t need this for anything later—it’s just for fun. But it’s quite beautiful. We saw in Part 8 that if we take the spin of the electron into account, we can think of bound states of the hydrogen atom as spinor-valued functions on the 3-sphere. …
    Found 1 month, 2 weeks ago on Azimuth
  17. 4436896.725929
    The paper proposes and studies new classical, type-free theories of truth and determinateness with unprecedented features. The theories are fully compositional, strongly classical (namely, their internal and external logics are both classical), and feature a defined determinateness predicate satisfying desirable and widely agreed principles. The theories capture a conception of truth and determinateness according to which the generalizing power associated with the classicality and full compositionality of truth is combined with the identification of a natural class of sentences – the determinate ones – for which clear-cut semantic rules are available. Our theories can also be seen as the classical closures of Kripke-Feferman truth: their ω-models, which we pin down precisely, result from including in the extension of the truth predicate the sentences that are satisfied by a Kripkean closed-off fixed point model. The theories can be compared with recent theories proposed by Fujimoto and Halbach, which feature a primitive determinateness predicate. In the paper we show that our theories entail all principles of Fujimoto and Halbach’s theories, and are proof-theoretically equivalent to Fujimoto and Halbach’s CD. We also establish some negative results on Fujimoto and Halbach’s theories: such results show that, unlike what happens in our theories, the primitive determinateness predicate prevents one from establishing clear and unrestricted semantic rules for the language with type-free truth.
    Found 1 month, 2 weeks ago on Carlo Nicolai's site
  18. 4743541.725934
    In Part 4 we saw that the classical Kepler problem—the problem of a single classical particle in an inverse square force—has symmetry under the group SO(4) of rotations of 4-dimensional space. Since the Lie algebra of this group is 𝔰𝔬(3) ⊕ 𝔰𝔬(3), we must have conserved quantities corresponding to these two copies of 𝔰𝔬(3). The physical meaning of these quantities is a bit obscure until we form suitable linear combinations. Then one combination is the angular momentum of the particle, while the other is a subtler conserved quantity: it’s the eccentricity vector of the particle divided by √(−2E), where the energy E is negative for bound states (that is, elliptical orbits). The advantage of working with the original two quantities is that they have very nice Poisson brackets: this says they generate two commuting symmetries. …
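    In standard notation (ours; with the particle's mass and the force constant set to 1), the combinations and brackets being referred to are

      \[
        \mathbf{M} = \frac{\mathbf{e}}{\sqrt{-2E}}, \qquad
        \mathbf{A}_{\pm} = \tfrac{1}{2}(\mathbf{L} \pm \mathbf{M}),
      \]
      \[
        \{A_{\pm i}, A_{\pm j}\} = \epsilon_{ijk}\,A_{\pm k}, \qquad
        \{A_{+i}, A_{-j}\} = 0,
      \]

    where 𝐋 is the angular momentum, 𝐞 the eccentricity (Runge–Lenz) vector, and E < 0 the energy of a bound orbit; the two mutually commuting copies of 𝔰𝔬(3) are the "two commuting symmetries".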
    Found 1 month, 3 weeks ago on Azimuth
  19. 5063200.72594
    The history of logic is a topical direction of research within contemporary logical knowledge. Such investigations, first, contribute to building a general picture of the evolution of logic, to understanding the changes its subject matter has undergone as a science and as an academic discipline, as well as the changes in the paradigmatic principles of its historical development, in the basic rules for constructing logical theories, and in the toolkit of those theories. Second, historical-logical studies make it possible to reveal how logical conceptions have influenced other scientific disciplines, above all philosophy and mathematics. Third, historical-logical analysis allows one to consider a particular author's logical position in a broad historico-philosophical context and to show how philosophical ideas have influenced the development of logical knowledge. Fourth, research in the history of logic helps to consider it in a broad historico-cultural context and to clarify the mutual influence of various logical views, particular cultural traditions, and the features of historical epochs.
    Found 1 month, 3 weeks ago on Heinrich Wansing's site
  20. 5063242.725949
    We present a logic which deals with connexive exclusion. Exclusion (also called “co-implication”) is considered to be a propositional connective dual to the connective of implication. Similarly to implication, exclusion turns out to be non-connexive in both classical and intuitionistic logics, in the sense that it does not satisfy certain principles that express such connexivity. We formulate these principles for connexive exclusion, which are in some sense dual to the well-known Aristotle’s and Boethius’ theses for connexive implication. A logical system in a language containing exclusion and negation can be called a logic of connexive exclusion if and only if it obeys these principles, and, in addition, the connective of exclusion in it is asymmetric, thus being different from a simple mutual incompatibility of propositions. We will develop a certain approach to such a logic of connexive exclusion based on a semantic justification of the connective in question. Our paradigm logic of connexive implication will be the connexive logic C, and, exactly like this logic, the logic of connexive exclusion turns out to be contradictory though not trivial.
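    For reference (our gloss), the implication-forms of the theses in question are

      \[
        \textbf{Aristotle:}\ \ \neg(A \to \neg A), \quad \neg(\neg A \to A); \qquad
        \textbf{Boethius:}\ \ (A \to B) \to \neg(A \to \neg B), \quad (A \to \neg B) \to \neg(A \to B);
      \]

    the paper's principles for connexive exclusion are, in some sense, duals of these, formulated for the exclusion connective rather than for →.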
    Found 1 month, 3 weeks ago on Heinrich Wansing's site
  21. 5293983.725957
    Lévy’s Upward Theorem says that the conditional expectation of an integrable random variable converges with probability one to its true value with increasing information. In this paper, we use methods from effective probability theory to characterise the probability-one set along which convergence to the truth occurs, and the rate at which the convergence occurs. We work within the setting of computable probability measures defined on computable Polish spaces and introduce a new general theory of effective disintegrations. We use this machinery to prove our main results, which (1) identify the points along which certain classes of effective random variables converge to the truth in terms of certain classes of algorithmically random points, and which further (2) identify when computable rates of convergence exist. Our convergence results significantly generalize earlier results within a unifying novel abstract framework, and there are no precursors of our results on computable rates of convergence. Finally, we make a case for the importance of our work for the foundations of Bayesian probability theory.
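    The classical theorem in question (stated here for orientation; the paper's contribution is its effective refinement): if X is integrable and (𝓕_n) is an increasing sequence of σ-algebras with 𝓕_∞ = σ(⋃_n 𝓕_n), then

      \[
        \mathbb{E}[X \mid \mathcal{F}_n] \;\longrightarrow\; \mathbb{E}[X \mid \mathcal{F}_\infty]
        \quad \text{almost surely and in } L^1 \text{ as } n \to \infty,
      \]

    so when X is 𝓕_∞-measurable the conditional expectations converge to X itself, i.e. to "the truth".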
    Found 1 month, 4 weeks ago on Simon M. Huttegger's site
  22. 5390382.725963
    There are four well-known models of fundamental objective probabilistic reality: classical probability, comparative probability, non-Archimedean probability, and primitive conditional probability. I offer two desiderata for an account of fundamental objective probability, comprehensiveness and non-superfluity. It is plausible that classical probabilities lack comprehensiveness by not capturing some intuitively correct probability comparisons, such as that it is less likely that 0 = 1 than that a dart randomly thrown at a target will hit the exact center, even though both classically have probability zero. We thus want a comparison between probabilities with a higher resolution than we get from classical probabilities. Comparative and non-Archimedean probabilities have a hope of providing such a comparison, but for known reasons do not appear to satisfy our desiderata. The last approach to this problem is to employ primitive conditional probabilities, such as Popper functions, and then argue that P(0 = 1 | 0 = 1 or hit center) = 0 < 1 = P(hit center | 0 = 1 or hit center). But now we have a technical question: How can we reconstruct a probability comparison, ideally satisfying the standard axioms of comparative probability, from a primitive conditional probability? I will prove that, given some plausible assumptions, it is impossible to perform this task: conditional probabilities just do not carry enough information to define a satisfactory comparative probability. The result is that none of the models satisfies our two desiderata. We end by briefly considering three paths forward.
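    The "standard axioms of comparative probability" referred to are essentially de Finetti's (our summary): ≽ is a total preorder on events such that

      \[
        \Omega \succ \emptyset, \qquad A \succeq \emptyset \ \text{for every } A, \qquad
        \text{and, whenever } A \cap C = B \cap C = \emptyset:\ \ A \succeq B \iff A \cup C \succeq B \cup C.
      \]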
    Found 2 months ago on PhilSci Archive
  23. 5555420.725968
    This is probably an old thing that has been discussed to death, but I only now noticed it. Suppose an open future view on which future contingents cannot have truth value. What happens to entailments? …
    Found 2 months ago on Alexander Pruss's Blog
  24. 5799733.725974
    We establish the equivalence of two much debated impartiality criteria for social welfare orders: Anonymity and Permutation Invariance. Informally, Anonymity says that, in order to determine whether one social welfare distribution w is at least as good as another distribution v, it suffices to know, for every welfare level, how many people have that welfare level according to w and how many people have that welfare level according to v. Permutation Invariance, by contrast, says that, to determine whether w is at least as good as v, it suffices to know, for every pair of welfare levels, how many people have that pair of welfare levels in w and v respectively.
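    Transcribing the two informal statements into notation (ours): for welfare levels ℓ, ℓ′, let n_w(ℓ) be the number of people at level ℓ in w, and n_{w,v}(ℓ, ℓ′) the number of people at level ℓ in w and level ℓ′ in v. Then

      \[
        \textbf{Anonymity:}\ \ (n_w, n_v) = (n_{w'}, n_{v'}) \;\Rightarrow\; (w \succeq v \iff w' \succeq v'),
        \qquad
        \textbf{Permutation Invariance:}\ \ n_{w,v} = n_{w',v'} \;\Rightarrow\; (w \succeq v \iff w' \succeq v').
      \]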
    Found 2 months ago on Jeremy Goodman's site
  25. 5995080.725979
    A recent result from theoretical computer science provides for the verification of answers to the Halting Problem, even when there is no plausible means by which to derive those answers using a bottom-up approach. We argue that this result has profound implications for the existence of strongly emergent phenomena. In this work we develop a computer science-based framework for thinking about strong emergence and in doing so demonstrate the plausibility of strongly emergent phenomena existing in our universe. We identify six sufficient criteria for strong emergence and detail the actuality of five of the six criteria. Finally, we argue for the plausibility of the sixth criterion by analogy and a case study of Boltzmann brains (with additional case studies provided in the appendices).
    Found 2 months, 1 week ago on PhilSci Archive
  26. 6081503.726004
    This paper proposes a dynamic temporal logic that is appropriate for modeling the dynamics of scientific knowledge (especially in historical sciences, such as Archaeology, Paleontology and Geology). For this formalization of historical knowledge, the work is divided into two topics: first, we define a temporal branching structure and define the terms for application in the Philosophy of Science; second, we define a logical system that consists of a variation of Public Announcement Logic in terms of temporal logic, with appropriate rules in a tableaux method.
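    The base system being varied is standard Public Announcement Logic, whose central clause is (our statement of the textbook semantics, not of the paper's temporal variant)

      \[
        M, w \models [!\varphi]\psi
        \quad\Longleftrightarrow\quad
        \big(M, w \models \varphi \;\Rightarrow\; M|_{\varphi}, w \models \psi\big),
      \]

    where M|_φ is M restricted to the worlds at which φ holds.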
    Found 2 months, 1 week ago on PhilSci Archive
  27. 6167808.726009
    In this paper, I develop a “safety result” for applied mathematics. I show that whenever a theory in natural science entails some non-mathematical conclusion via an application of mathematics, there is a counterpart theory that carries no commitment to mathematical objects, entails the same conclusion, and the claims of which are true if the claims of the original theory are “correct”: roughly, true given the assumption that mathematical objects exist. The framework used for proving the safety result has some advantages over existing nominalistic accounts of applied mathematics. It also provides a nominalistic account of pure mathematics.
    Found 2 months, 1 week ago on PhilSci Archive
  28. 6670552.726015
    There is a longstanding puzzle about empty names. On the one hand, the principles of classical logic seem quite plausible. On the other hand, there would seem to be truths involving empty names that require rejecting certain classically valid principles.
    Found 2 months, 2 weeks ago on Michael Caie's site
  29. 6670602.726023
    Consider the property of being something that is identical to Hesperus. For short, call this the property of being Hesperus. What is the nature of this property? How does it relate to the property of being Phosphorus? And how do these properties relate to the purely haecceitistic property of being v—the unique thing that has the property of being Hesperus and the property of being Phosphorus?
    Found 2 months, 2 weeks ago on Michael Caie's site
  30. 6748843.726031
    Casajus (J Econ Theory 178, 2018, 105–123) provides a characterization of the class of positively weighted Shapley values for finite games from an infinite universe of players via three properties: efficiency, the null player out property, and superweak differential marginality. The latter requires two players’ payoffs to change in the same direction whenever only their joint productivity changes, that is, their individual productivities stay the same. Strengthening this property into (weak) differential marginality yields a characterization of the Shapley value. We suggest a relaxation of superweak differential marginality into two subproperties: (i) hyperweak differential marginality and (ii) superweak differential marginality for infinite subdomains. The former (i) only rules out changes in the opposite direction. The latter (ii) requires changes in the same direction for players within certain infinite subuniverses. Together with efficiency and the null player out property, these properties characterize the class of weighted Shapley values.
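    For reference (our addition), the symmetric Shapley value that the weighted family generalizes is

      \[
        \mathrm{Sh}_i(N, v) \;=\; \sum_{S \subseteq N \setminus \{i\}}
          \frac{|S|!\,(|N| - |S| - 1)!}{|N|!}\,\big(v(S \cup \{i\}) - v(S)\big);
      \]

    positively weighted Shapley values instead split each coalition's Harsanyi dividend among its members in proportion to their weights rather than equally.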
    Found 2 months, 2 weeks ago on André Casajus's site