-
This work shows that the ontic-epistemic dichotomy is insufficient to capture the different levels of ignorance and their implications for probability theories. It proposes an essentially epistemic interpretation of quantum mechanics, built on an operational basis firmly anchored to experimental data and scientific methods. This approach enables a rigorous treatment of numerical values obtained from experiments without resorting to unnecessary ontological or metaphysical assumptions.
-
The standard formalism of quantum theory is derived by analyzing the behavior of single-variable physical systems. These systems, which have a minimal information capacity of only one piece of information, exhibit indeterministic behavior under independent measurements but can be described probabilistically for dependent measurements. By enforcing the principle of probability conservation in the transformations of outcome probabilities across various measurement scenarios, we derive the core components of standard quantum theory, including the Born rule, the Hilbert space structure, and the Schrödinger equation. Furthermore, we demonstrate that the requirements for conducting quantum experiments – specifically, preparing physical systems in coherent states – effectively reduce the number of independent variables to one, thereby transforming these systems into single-variable ones in practice. This completes our first-principles, information-theoretic derivation of quantum theory as the physics of single-variable physical systems.
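For reference, two of the components recovered by the derivation can be stated in their textbook forms (standard notation, not the paper's own development):

```latex
% Born rule: the probability of outcome a_i for a system in state |psi>
P(a_i) = \lvert \langle a_i \mid \psi \rangle \rvert^{2}

% Schrödinger equation: unitary evolution of the state between measurements
i\hbar \, \frac{\partial}{\partial t} \lvert \psi(t) \rangle = \hat{H} \, \lvert \psi(t) \rangle
```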
-
Regular readers may know that I’ve been interested in epistocracy for quite some time now. Epistocracy is a political regime in which political power is allocated according to criteria of competence and knowledge. …
-
Last week I reblogged a post from 2023 where I began a discussion of a topic in a paper by Gardiner and Zaharatos (2022) (G & Z). G & Z fruitfully trace out connections between the severity requirement and the notion of sensitivity in epistemology. …
-
It is shown that one common formulation of Stalnaker’s semantics for conditionals is incomplete: it has no sound and (strongly) complete proof system. At first, this seems to conflict with well-known completeness results for this semantics (e.g., Stalnaker and Thomason 1967, Stalnaker 1970 and Lewis 1973, ch. 6). As it turns out, it does not: these completeness results rely on another closely related formulation of the semantics that is provably complete. Specifically, the difference comes down to how the Limit Assumption is stated. I close with some remarks about what this means for the logic of conditionals.
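For orientation, the Limit Assumption is standardly put as follows (a textbook rendering; the paper's point is that superficially equivalent ways of stating it come apart):

```latex
% Limit Assumption (standard formulation): for every world w and every
% nonempty proposition A, the set of closest A-worlds to w is nonempty.
\forall w \,\forall A \subseteq W \,\big( A \neq \emptyset \;\rightarrow\;
  \min{}_{\leq_w}(A) \neq \emptyset \big),
\quad\text{where } \min{}_{\leq_w}(A) = \{\, x \in A : \forall y \in A,\; x \leq_w y \,\}.
```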
-
In recent decades, a gap between two kinds of physical reasoning has opened up. Applied and phenomenological physics show all basic characteristics of canonical 20th century science. But fundamental physics, represented by high energy physics model building, quantum gravity and cosmology, faces substantially new challenges that influence the nature of the scientific process. Those shifts can be expected to become even more conspicuous in the period up to 2050. Exploring their full scope will arguably be an important task for fundamental physics in upcoming decades.
-
The nature of branching in the many-worlds interpretation (MWI) of quantum mechanics remains an open question, particularly regarding its locality and compatibility with special relativity. This paper challenges the conventional view that branching is either global or local, demonstrating instead that it is nonlocal for entangled systems. Through a new analysis of the EPR-Bohm experiment, I argue that global branching has several potential issues and can hardly be justified. At the same time, I argue that branching cannot be entirely local, as entangled particles exhibit simultaneous, spacelike-separated branching, manifesting an apparent action at a distance within individual worlds. However, while nonlocal branching suggests the emergence of a preferred Lorentz frame within each world, the multiverse as a whole retains full Lorentz invariance, ensuring no superluminal signaling. By refining the ontology of branching and resolving tensions between MWI and relativistic constraints, this analysis may help advance our understanding of quantum nonlocality and also strengthen MWI’s standing as a viable interpretation of quantum mechanics.
-
Applied category theorists are flocking to AI, because that’s where the money is. I avoid working on it, both because I have an instinctive dislike of ‘hot topics’, and because at present AI is mainly being used to make rich and powerful people richer and more powerful. …
-
A central question in philosophy of science and epistemology of science concerns the characterization of the progress of science. Many philosophers of science and epistemologists have developed accounts of scientific progress, laying down desiderata for and providing success criteria of any account of scientific progress. Extant accounts of scientific progress are surveyed and critically assessed, and it is shown that all face the same problem. The constitution-promotion distinction – a commitment shared by all the accounts – is identified as the root of the problem for the extant accounts. In their place, a novel way of understanding scientific progress is developed – inspired by pragmatic philosophy of science and zetetic epistemology – which rejects the problematic constitution-promotion distinction and, importantly, provides a vision of scientific progress that does not depend on the aim of science.
-
We argue that special and general theories of relativity implicitly assume spacetime events correspond to quantum measurement outcomes. This leads to a change in how one should view the equivalence of spacetime and gravity. We describe a Bell test using timelike measurements that indicates a nonclassical causal structure which does not violate no-signaling. From this perspective, the violation of the Bell inequalities is already evidence for the nonclassical structure of flat spacetime as seen by an agent embedded in it. We argue that spacetime geometry can be learned by an embedded agent with internal actuators and sensors making internal measurements.
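For context, the Bell-type constraint in question is standardly expressed by the CHSH inequality (textbook statement; the paper's timelike variant is not reproduced here):

```latex
% CHSH inequality: any local hidden-variable model satisfies
\lvert E(a,b) + E(a,b') + E(a',b) - E(a',b') \rvert \;\leq\; 2,
% whereas quantum mechanics allows values up to Tsirelson's bound 2*sqrt(2),
% without thereby permitting superluminal signaling.
```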
-
Performativity refers to the phenomenon that scientific conceptualisations can sometimes change their target systems or referents. A widely held view in the literature is that scientists ought not to deliberately deploy performative models or theories with the aim of eliciting desirable changes in their target systems. This paper has three aims. First, I cast and defend this received view as a worry about autonomy-infringing paternalism and, to that end, develop a taxonomy of the harms it can impose. Second, I consider various approaches to this worry within the extant literature and argue that these offer only unsatisfactory responses. Third, I propose two positive claims. Manipulation of target systems is (a) not inherently paternalist and can be unproblematic, and is (b) sometimes paternalist, but whenever such paternalism is inescapable, it must be justifiable. I generalise an example of modelling international climate change coordination to develop this point.
-
We evaluate the roles general relativistic assumptions play in the simulations used in recent black hole observations, including those of LIGO-Virgo and the Event Horizon Telescope. In both experiments, simulations play an ampliative role, enabling the extraction of more information from the data than would be possible otherwise. This comes at the cost of theory-ladenness. We discuss the issue of inferential circularity, which arises in some applications; classify some of the epistemic strategies used to reduce the extent of theory-ladenness; and discuss ways in which these strategies are model independent.
-
The extraterrestrial hypothesis (ETH), the hypothesis that an extraterrestrial civilization (ETC) is active on Earth today, is taboo in academia, but the assumptions behind this taboo are faulty. Advances in biology have rendered the notion that complex life is rare in our Galaxy improbable. The objection that no ETC would come to Earth to hide from us does not consider all possible alien motives or means. For an advanced ETC, the convergent instrumental goals of all rational agents – self-preservation and the acquisition of resources – would support the objectives of removing existential threats and gathering strategic and non-strategic information.
-
Concept formation has recently become a widely discussed topic in philosophy under the headings of “conceptual engineering”, “conceptual ethics”, and “ameliorative analysis”. Much of this work has been inspired either by the method of explication or by ameliorative projects. In the former case, concept formation is usually seen as a tool of the sciences, of formal disciplines, and of philosophy. In the latter case, concept formation is seen as a tool in the service of social progress. While recent philosophical discussions on concept formation have addressed natural sciences such as physics as well as various life sciences, there has so far been little direct engagement with the social sciences. Addressing this shortcoming is important because many debates about socially relevant concepts such as power, gender, democracy, risk, justice, or rationality may best be understood as engaging in conceptual engineering. This topical collection addresses the nature and structure of concept formation in the natural, the life, and the social sciences alike, both as a process taking place within science and as an activity that aims at a broader impact in society. This helps to understand how concept formation proceeds not only in the natural sciences but also in disciplines such as psychology, cognitive science, political science, sociology and economics.
-
Did you ever submit a grant proposal to a funding agency? Then you have likely encountered the request to specify your research method. Anecdotal evidence suggests that philosophers often address this unpopular request by mentioning reflective equilibrium (RE), the method proposed by Goodman (1983 [1954]) and baptized by John Rawls in his “A Theory of Justice” (1971). Appeal to RE has indeed become a standard move in ethics (see, e.g., Daniels, 1996; Swanton, 1992; van der Burg & van Willigenburg, 1998; DePaul, 2011; Mikhail, 2011; Beauchamp & Childress). The method has also been referred to in many other branches of philosophy, e.g., in methodological discussions about logic (e.g., Goodman, 1983; Resnik, 1985, 1997; Brun, 2014; Peregrin & Svoboda, 2017) and theories of rationality (e.g., Cohen, 1981; Stein, 1996). Some philosophers have gone as far as to argue that RE is unavoidable in ethics (Scanlon, 2003) or simply the philosophical method (Lewis, p. x; Keefe, 2000, ch. 2). The popularity of RE indicates that its key idea resonates well with the inclinations of many philosophers: You start with your initial views or commitments on a theme and try to systematize them in terms of a theory or a few principles. Discrepancies between theory and commitments trigger a characteristic back and forth between the commitments and the theories, in which commitments and theories are adjusted to each other until an equilibrium state is reached.
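The back-and-forth just described is, in effect, an iterative adjustment procedure. A schematic sketch (every name and the toy coherence measure below are invented for illustration; this is not a canonical formalization of RE):

```python
# Schematic sketch of reflective equilibrium as iterative mutual adjustment.
# Commitments and principles are modeled as sets of items; "coherence" is a
# deliberately crude toy measure. All names here are illustrative only.

def coherence(commitments: set, principles: set) -> int:
    """Toy measure: the number of commitments the principles agree with."""
    return len(commitments & principles)

def reflective_equilibrium(commitments: set, principles: set, max_rounds: int = 100):
    commitments, principles = set(commitments), set(principles)
    for _ in range(max_rounds):
        conflicts = commitments ^ principles  # points of disagreement
        if not conflicts:                     # equilibrium: nothing left to adjust
            return commitments, principles
        item = next(iter(sorted(conflicts)))
        if item in commitments:
            # A commitment the theory fails to capture: extend the theory,
            # or give up the commitment, whichever coheres better overall.
            if coherence(commitments, principles | {item}) >= coherence(commitments - {item}, principles):
                principles.add(item)
            else:
                commitments.discard(item)
        else:
            # A principle with no matching commitment: accept its verdict,
            # or revise the theory so as to drop it.
            if coherence(commitments | {item}, principles) >= coherence(commitments, principles - {item}):
                commitments.add(item)
            else:
                principles.discard(item)
    return commitments, principles

# Example: partially conflicting starting points converge to an equilibrium.
print(reflective_equilibrium({"c1", "c2", "c3"}, {"c2", "p4"}))
```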
-
Nietzsche’s first book was entitled The Birth of Tragedy out of the Spirit of Music (1872), and one of his very last works was called The Case of Wagner: A Musician’s Problem (1888). As this simple fact indicates, reflection on art (and especially, on music and drama) is an abiding and central feature of Nietzsche’s thought. Indeed, very nearly all of his works address aesthetic questions at least in passing. Some of these questions are familiar from the philosophical tradition: e.g., how should we explain the effect tragedy has on us? What is the relation of aesthetic value to other kinds of value?
-
Benacerraf famously argued that no set theoretic reduction can capture the natural numbers. While one might conclude from this that the natural numbers are some kind of sui generis entities, Benacerraf instead opts for a structuralist view on which different things can play the role of different numbers. …
-
Recently, Dardashti et al. (Stud Hist Philos Sci Part B Stud Hist Philos Mod Phys 67:1–11, 2019) proposed a Bayesian model for establishing Hawking radiation by analogical inference. In this paper we investigate whether their model would work as a general model for analogical inference. We study how it performs when varying the believed degree of similarity between the source and the target system. We show that there are circumstances in which the degree of confirmation for the hypothesis about the target system obtained by collecting evidence from the source system goes down when increasing the believed degree of similarity between the two systems. We then develop an alternative model in which the direction of the variation of the degree of confirmation always coincides with the direction of the believed degree of similarity. Finally, we argue that the two models capture different types of analogical inference.
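As a toy illustration of how a Bayesian model can tie confirmation of the target hypothesis to the believed degree of similarity (a deliberately simple stand-in, not Dardashti et al.'s actual model; all probabilities are made up, and unlike the paper's model this toy happens to behave monotonically):

```python
# Toy Bayesian model of analogical inference. s in [0, 1] is the believed
# degree of similarity: with probability s the target hypothesis H_T simply
# inherits the truth value of the source hypothesis H_S; otherwise H_T is an
# independent coin with base rate q. Evidence E bears on the source only.

def posterior_target(s, p_hs=0.5, q=0.3, like_e_hs=0.9, like_e_not_hs=0.2):
    # Joint prior over (H_S, H_T).
    joint = {}
    for hs in (True, False):
        for ht in (True, False):
            p_ht_given_hs = s * (1.0 if ht == hs else 0.0) + (1 - s) * (q if ht else 1 - q)
            joint[(hs, ht)] = (p_hs if hs else 1 - p_hs) * p_ht_given_hs
    # Condition on source evidence E; the likelihood depends on H_S alone.
    joint_e = {k: v * (like_e_hs if k[0] else like_e_not_hs) for k, v in joint.items()}
    z = sum(joint_e.values())
    prior_ht = joint[(True, True)] + joint[(False, True)]
    post_ht = (joint_e[(True, True)] + joint_e[(False, True)]) / z
    return prior_ht, post_ht

for s in (0.0, 0.25, 0.5, 0.75, 1.0):
    prior, post = posterior_target(s)
    print(f"s={s:.2f}  P(H_T)={prior:.3f}  P(H_T|E)={post:.3f}")
```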
-
1. Strong and weak notions of erasure are distinguished according to whether the single erasure procedure does or does not leave the environment in the same state independently of the pre-erasure state.
2. Purely thermodynamic considerations show that strong erasure cannot be dissipationless.
3. The main source of entropy creation in erasure processes at molecular scales is the entropy that must be created to suppress thermal fluctuations (“noise”).
4. A phase space analysis recovers no minimum entropy cost for weak erasure and a positive minimum entropy cost for strong erasure.
5. An information entropy term has been attributed mistakenly to pre-erasure states in the Gibbs formalism through the neglect of an additive constant in the Gibbs entropy formula S = −k Σᵢ pᵢ ln pᵢ (stated below).
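Point 5 turns on the familiar Gibbs formula; with the neglected additive constant written explicitly (standard form, not the paper's notation):

```latex
% Gibbs entropy, with the additive constant made explicit
S \;=\; -k \sum_i p_i \ln p_i \;+\; \text{const.}
% Point 5's claim: silently setting const = 0 lets the summation term be
% misread as a physical "information entropy" of the pre-erasure state.
```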
-
Hadfield-Menell et al. (2017) propose the Off-Switch Game, a model of Human-AI cooperation in which AI agents always defer to humans because they are uncertain about our preferences. I explain two reasons why AI agents might not defer. First, AI agents might not value learning. Second, even if AI agents value learning, they might not be certain to learn our actual preferences.
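For orientation, a minimal expected-utility sketch of why the original model predicts deference (the Gaussian prior and all numbers are illustrative; the paper's two objections target exactly the assumptions this sketch builds in):

```python
# Toy version of the Off-Switch Game payoff comparison. The robot can: act
# now (worth E[U]), switch itself off (worth 0), or defer to the human. If
# the human is rational and knows U, deferring yields E[max(U, 0)], which
# weakly dominates both alternatives -- hence deference under uncertainty.
import random

random.seed(0)
samples = [random.gauss(0.0, 1.0) for _ in range(100_000)]  # robot's uncertainty over U

act_now = sum(samples) / len(samples)                         # ~ E[U] = 0
switch_off = 0.0
defer = sum(max(u, 0.0) for u in samples) / len(samples)      # ~ E[max(U,0)] ~ 0.4

print(f"act now:    {act_now:+.3f}")
print(f"switch off: {switch_off:+.3f}")
print(f"defer:      {defer:+.3f}   # highest: the uncertain robot defers")
```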
-
The changes that quantum states undergo during measurement are both probabilistic and nonlocal. These two characteristics complement one another to ensure compatibility with relativity and maintain conservation laws. Nonlocal entanglement relations provide a means to enforce conservation laws in a probabilistic theory, while the probabilistic nature of nonlocal effects prevents the superluminal transmission of information. In order to explain these measurement-induced changes in terms of fundamental physical processes it is necessary to take these two key characteristics into account. One way to do this is to modify the Schrödinger equation by adding stochastic, nonlinear terms. A number of such proposals have been made over the past few decades. A recently proposed equation based on the assumption that wave function collapse is induced by a sequence of correlating interactions of the kind that constitute measurements has been shown to maintain strict adherence to conservation laws in individual instances, and has also eliminated the need to introduce any new, ad hoc physical constants. In this work it is shown that the stochastic modification to the Schrödinger equation is Lorentz invariant. It is further argued that the additional spacetime structure that it requires provides a way to implement the assumption that spacelike-separated operators (and measurements) commute, and that this assumption of local commutativity should be regarded as a third postulate of relativity.
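For reference, stochastic nonlinear modifications of this family typically take the following Itô form (the generic CSL-style template, not the specific equation this paper analyzes):

```latex
% Generic stochastic, nonlinear modification of the Schrödinger equation:
d\lvert\psi_t\rangle \;=\; \Big[ -\tfrac{i}{\hbar}\hat{H}\,dt
  \;+\; \sqrt{\lambda}\,\big(\hat{A}-\langle\hat{A}\rangle_t\big)\,dW_t
  \;-\; \tfrac{\lambda}{2}\big(\hat{A}-\langle\hat{A}\rangle_t\big)^{2} dt \Big]
  \lvert\psi_t\rangle
% where A is the collapse operator, <A>_t = <psi_t|A|psi_t>, W_t is a Wiener
% process, and the nonlinear <A>_t terms preserve the norm while driving
% collapse onto eigenstates of A.
```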
-
This essay is a two-step reflection on the question ‘Which events (can be said to) occur in quantum phenomena?’ The first step regiments the ontological category of statistical phenomena and studies the adequacy of probabilistic event models as descriptions thereof. Guided by the conviction that quantum phenomena are to be circumscribed within this same ontological category, the second step highlights the peculiarities of probabilistic event models of some non-relativistic quantum phenomena, and thereby of what appear to be some plausible answers to our initial question. The reflection ends in an aporetic state, as is by now usual in encounters between ontology and the quantum.
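For reference, the standard mathematical skeleton of a probabilistic event model is a probability space (textbook definition; the paper's regimentation may refine or depart from it):

```latex
% A probability space (Omega, F, P): Omega is the set of possible outcomes;
% F \subseteq 2^Omega is a sigma-algebra of events, closed under complement
% and countable union; and P is a countably additive measure normalized to 1:
(\Omega, \mathcal{F}, P), \qquad
P : \mathcal{F} \to [0,1], \qquad P(\Omega) = 1, \qquad
P\Big(\bigcup_{n} E_n\Big) = \sum_{n} P(E_n)
\;\;\text{for pairwise disjoint } E_n \in \mathcal{F}.
```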
-
The inference pattern known as disjunctive syllogism (DS) appears as a derived rule in Gentzen’s natural deduction calculi NI and NK. This is a paradoxical feature of Gentzen’s calculi in so far as DS is sometimes thought of as appearing intuitively more elementary than the rules ∨E, ¬E, and EFQ that figure in its derivation. For this reason, many contemporary presentations of natural deduction depart from Gentzen and include DS as a primitive rule. However, such departures violate the spirit of natural deduction, according to which primitive rules are meant to relationally define logical connectives via universal properties (§2). This situation raises the question: Can disjunction be relationally defined with DS instead of with Gentzen’s ∨I and ∨E rules? We answer this question in the affirmative and explore the duality between Gentzen’s definition and our own (§3). We argue further that the two universal characterizations, rather than provide competing relational definitions of a single disjunction operator, disambiguate natural language’s “or” (§4). Finally, this disambiguation is shown to correspond exactly with the additive and multiplicative disjunctions of linear logic (§5). The hope is that this analysis sheds new light on the latter connective, so often deemed mysterious in writing about linear logic.
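The rules at issue, in standard natural-deduction notation (Gentzen's primitive introduction and elimination rules for ∨, alongside the derived DS):

```latex
% Gentzen's primitives for disjunction:
\frac{A}{A \lor B}\;\lor\! I_1
\qquad
\frac{B}{A \lor B}\;\lor\! I_2
\qquad
\frac{A \lor B \qquad \begin{matrix}[A]\\ \vdots\\ C\end{matrix}
      \qquad \begin{matrix}[B]\\ \vdots\\ C\end{matrix}}{C}\;\lor\! E
% The derived rule under discussion:
\qquad
\frac{A \lor B \qquad \lnot A}{B}\;\mathrm{DS}
```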
-
Neil Mehta has written a fantastic book. A Pluralist Theory of Perception develops a novel theory of perception that illuminates the metaphysical structure, epistemic significance, and semantic role of perceptual consciousness. By and large, I found the core tenets of Mehta’s theory to be highly plausible and successfully defended. I could quibble with some parts (e.g., his claim that our conscious awareness of sensory qualities is non-representational). But I suspect our disagreements are largely verbal, and where they are non-verbal, they are minor. Instead of focusing on disagreements, in this commentary I wish to explore the metaphysical ramifications of Mehta’s theory with respect to the mind-body problem. Mehta has a great deal to say about the metaphysics of perception. Much of it seems to me to be in tension with physicalism. But throughout the book he remains officially neutral on the truth of physicalism, “in reflection of [his] genuine uncertainty” (ibid: 100). I will try to show that Mehta’s commitments lead almost inexorably to dualism (or, at least, away from physicalism) by giving three arguments against physicalism that centrally rely on premises to which Mehta is committed.
-
If the philosophy of mathematics wants to be rigorous, the concept of infinity must stop being equivocal (both potential and actual) as it currently is. The conception of infinity as actual is responsible for all the paradoxes that compromise the very foundation of mathematics; it is also the basis of Cantor's argument for the non-countability of R and for the existence of infinite cardinals of different magnitudes. Here we present a proof that all infinite sets (in the potential sense) are countable and that there are no infinite cardinals.
-
The philosophical literature on mathematical structuralism and its history has focused on the emergence of structuralism in the 19th century. Yet modern abstractionist accounts cannot provide an historical account for the abstraction process. This paper will examine the role of relations in the history of mathematics, focusing on three main epochs where relational abstraction is most prominent: ancient Greek, 17th and 19th centuries, to provide a philosophical account for the abstraction of structures. Though these structures emerged in the 19th century with definitional axioms, the need for such axioms in the abstraction process comes about, as this paper will show, after a series of relational abstractions without a suitable basis.
-
This paper argues for a unified account of semantic and pragmatic infelicity. It is argued that an utterance is infelicitous when it communicates an inconsistent set of propositions, given the context. In cases of semantic infelicity the relevant utterance expresses a set of inconsistent propositions, whereas pragmatic infelicity is a matter of the utterance conflicting with contextual expectations or assumptions. We spell out this view within the standard framework according to which a central aim of communication is to update a body of information shared among the participants. We show that this account explains different kinds of infelicity for both declarative and non-declarative utterances. Further, the account is seen to make correct predictions for a range of cases involving irony, joking, and related non-assertoric utterances.
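The shared-information framework referred to is the familiar common-ground model; a minimal sketch of the update-and-consistency check (the possible-worlds representation below is invented for illustration):

```python
# Minimal Stalnakerian context update: the common ground is a set of worlds,
# an assertion updates it by intersection, and an utterance counts as
# infelicitous in the paper's sense when the propositions it communicates are
# jointly inconsistent given the context, i.e. the update empties the context.

from itertools import product

# Worlds as truth-value assignments to two atomic sentences.
WORLDS = [dict(zip(("rain", "cold"), vals)) for vals in product((True, False), repeat=2)]

def update(context, proposition):
    """Assert a proposition: keep only the worlds where it holds."""
    return [w for w in context if proposition(w)]

def infelicitous(context, propositions):
    """True if communicating all the propositions empties the context."""
    for p in propositions:
        context = update(context, p)
    return not context

ctx = list(WORLDS)
# "It's raining but it's not raining": inconsistent propositions -> infelicity.
print(infelicitous(ctx, [lambda w: w["rain"], lambda w: not w["rain"]]))  # True
# "It's raining and it's cold": consistent -> felicitous.
print(infelicitous(ctx, [lambda w: w["rain"], lambda w: w["cold"]]))      # False
```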
-
My guess is that most of you have never read Friedrich Nietzsche’s Thus Spoke Zarathustra. While it offers very few actual arguments, it’s some of my all-time favorite poetry. To sell you, here is perhaps my favorite chapter, “The Preachers of Death.”
By the way, I know scholars disfavor this translation. …
-
In theory, replication experiments purport to independently validate claims from previous research or provide some diagnostic evidence about their truth value. In practice, this value of replication experiments is often taken for granted. Our research shows that in replication experiments, practice often does not live up to theory. Most replication experiments involve confounding factors, and their results are not uniquely determined by the treatment of interest, hence are uninterpretable. These results can be driven by the true data generating mechanism, limitations of the original experimental design, discrepancies between the original and the replication experiment, distinct limitations of the replication experiment, or combinations of any of these factors. Here we introduce the notion of a minimum viable experiment to replicate, which defines experimental conditions that always yield interpretable replication results and is thus replication-ready. We believe that most reported experiments are not replication-ready, and that before striving to replicate a given result we need theoretical precision in, or systematic exploration of, the experimental space to discover empirical regularities.